Mirror of https://dev.iopsys.eu/bbf/bbfdm.git (synced 2025-12-10 07:44:39 +01:00)

T#13964: Generate single datamodel.json from datamodel xmls

Commit: 6332fe98e1 (parent 5922644a43)
21 changed files with 23331 additions and 14582 deletions

.gitignore (vendored): 1 changed line

@@ -9,3 +9,4 @@ bbfdmd/ubus/bbfdmd
 docs/index.md
 __pycache__
 out
+/datamodel

@@ -2,7 +2,7 @@
 
 It is often required to extend datamodel parameters of the device, extending the datamodel parameters using json plugin is the simplest way for the same.
 
-To extend the datamodel using json plugin, its required to be defined it, as it is defined in `TR181.json` file and then place that json in '/etc/bbfdm/json/' directory of device.
+To extend the datamodel using json plugin, its required to be defined it, as it is defined in `datamodel.json` file and then place that json in '/etc/bbfdm/json/' directory of device.
 
 It is often the case, that the supported mapping might not handle all the scenarios, and required some structural changes to fulfill the new requirements, to make these plugins backward compatible with the older mappings some kind of check was required, which is can be solved with having a "version" field in the plugin, which describes the list of supported mappings with that specific version. This can be added as below:
 ```json
@@ -130,7 +130,7 @@ And on these tree components, we can do:
 - Del
 - Operate/commands
 
-If we skip multi-instance objects for some time, everything else is stand-along entity, I mean, one parameter is having one specific information, so for those parameters information could be fetched from `uci/ubus` or some external `cli` command. The `TR181.json` has all the required the information about the parameters except how to get it from the device. In json plugin, we solve that by introducing a new element in the tree called mapping, which describes how to process the operations on that specific datamodel parameter.
+If we skip multi-instance objects for some time, everything else is stand-along entity, I mean, one parameter is having one specific information, so for those parameters information could be fetched from `uci/ubus` or some external `cli` command. The `datamodel.json` has all the required the information about the parameters except how to get it from the device. In json plugin, we solve that by introducing a new element in the tree called mapping, which describes how to process the operations on that specific datamodel parameter.
 
 ```json
 mapping: [

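To make the documentation changes above concrete, a vendor JSON plugin of the kind they describe could look roughly like the sketch below: a top-level "version" field plus a per-parameter "mapping" entry resolved through UCI. This is only an illustrative sketch; the object name, the `uci` sub-keys and the other field names are placeholders chosen for this example and are not taken from the schema shipped in this commit (the authoritative structure is whatever `tools/validate_json_plugin.py` accepts).

```json
{
    "_note": "Illustrative sketch only; key names are placeholders, not the schema shipped in this commit",
    "version": "1",
    "Device.X_IOPSYS_EU_Example.": {
        "type": "object",
        "protocols": ["cwmp", "usp"],
        "Enable": {
            "type": "boolean",
            "protocols": ["cwmp", "usp"],
            "read": true,
            "write": true,
            "mapping": [
                {
                    "type": "uci",
                    "uci": {
                        "file": "example",
                        "section": { "type": "example" },
                        "option": { "name": "enable" }
                    }
                }
            ]
        }
    }
}
```

Placing such a file in '/etc/bbfdm/json/' on the device is what the documentation above means by extending the data model through a JSON plugin.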
@@ -2,23 +2,16 @@
 
 As mentioned in README, all Data Models are stored in the **'dmtree'** folder. In order to implement a new object/parameter, you need to expand its get/set/add/delete functions and then save them in the right folder.
 
-`bbfdm` library offers a tool to generate templates of the source code from json files placed under **'dmtree/json'**. So, any developer can fill these json files ([tr181](../../libbbfdm/dmtree/json/tr181.json) or [tr104](../../libbbfdm/dmtree/json/tr104.json)) with mapping field according to UCI, UBUS or CLI commands then generate the source code in C.
+`bbfdm` library offers a tool to generate templates of the source code from json files placed under **'dmtree/json'**. So, any developer can fill the data model json file [data_model](../../libbbfdm/dmtree/json/datamodel.json) with mapping field according to UCI, UBUS or CLI commands then generate the source code in C.
 
 ```bash
 $ ./convert_dm_json_to_c.py
-Usage: convert_dm_json_to_c.py <data model name> [Object path]
+Usage: ./tools/convert_dm_json_to_c.py [Object path]
-data model name: The data model(s) to be used, for ex: tr181 or tr181,tr104
 Examples:
-- convert_dm_json_to_c.py tr181
+- ./tools/convert_dm_json_to_c.py
-==> Generate the C code of tr181 data model in datamodel/ folder
+==> Generate the C code of full data model in datamodel/ folder
-- convert_dm_json_to_c.py tr104
+- ./tools/convert_dm_json_to_c.py Device.DeviceInfo.
-==> Generate the C code of tr104 data model in datamodel/ folder
-- convert_dm_json_to_c.py tr181,tr104
-==> Generate the C code of tr181 and tr104 data model in datamodel/ folder
-- convert_dm_json_to_c.py tr181 Device.DeviceInfo.
 ==> Generate the C code of Device.DeviceInfo object in datamodel/ folder
-- convert_dm_json_to_c.py tr104 Device.Services.VoiceService.{i}.Capabilities.
-==> Generate the C code of Device.Services.VoiceService.{i}.Capabilities. object in datamodel/ folder
 ```
 
 Below some examples of **UCI**, **UBUS** or **CLI** mappings:

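The concrete UCI/UBUS/CLI examples referenced by the last context line of this hunk sit outside the diff context. As a rough, non-authoritative sketch, the three mapping flavours usually differ only in their "type" member and the backend-specific sub-object; every key name below is an illustrative placeholder rather than the exact schema used by the repository:

```json
{
    "_note": "Illustrative placeholders only; see the repository docs and validate_json_plugin.py for the exact schema",
    "uci_mapping": {
        "type": "uci",
        "uci": { "file": "network", "section": { "type": "interface" }, "option": { "name": "mtu" } }
    },
    "ubus_mapping": {
        "type": "ubus",
        "ubus": { "object": "network.interface", "method": "status", "args": {}, "key": "uptime" }
    },
    "cli_mapping": {
        "type": "cli",
        "cli": { "command": "cat", "args": ["/proc/uptime"] }
    }
}
```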
@@ -14,12 +14,8 @@ exec_cmd_verbose pylint -d R,C,W0603 tools/*.py
 
 echo "********* Validate JSON Plugin *********"
 
-echo "Validate BBF TR-181 JSON Plugin"
+echo "Validate BBF Data Model JSON Plugin"
-./tools/validate_json_plugin.py libbbfdm/dmtree/json/tr181.json
+./tools/validate_json_plugin.py libbbfdm/dmtree/json/datamodel.json
-check_ret $?
 
-echo "Validate BBF TR-104 JSON Plugin"
-./tools/validate_json_plugin.py libbbfdm/dmtree/json/tr104.json
 check_ret $?
 
 echo "Validate X_IOPSYS_EU_Dropbear JSON Plugin"

@@ -50,18 +46,8 @@ echo "Validate test overwrite Plugin"
 ./tools/validate_json_plugin.py test/vendor_test/test_overwrite.json
 check_ret $?
 
-echo "Validate TR-181 JSON Plugin after generating from XML"
+echo "Validate Data Model JSON Plugin after generating from TR-181, TR-104 and TR-135 XML Files"
-json_path=$(./tools/convert_dm_xml_to_json.py test/tools/tr-181-2-*-cwmp-full.xml test/tools/tr-181-2-*-usp-full.xml Device.)
+json_path=$(./tools/convert_dm_xml_to_json.py -d test/tools/)
-./tools/validate_json_plugin.py $json_path
-check_ret $?
-
-echo "Validate TR-104 JSON Plugin after generating from XML"
-json_path=$(./tools/convert_dm_xml_to_json.py test/tools/tr-104-2-0-2-cwmp-full.xml test/tools/tr-104-2-0-2-usp-full.xml Device.Services.VoiceService.)
-./tools/validate_json_plugin.py $json_path
-check_ret $?
-
-echo "Validate TR-135 JSON Plugin after generating from XML"
-json_path=$(./tools/convert_dm_xml_to_json.py test/tools/tr-135-1-4-1-cwmp-full.xml test/tools/tr-135-1-4-1-usp-full.xml Device.Services.STBService.)
 ./tools/validate_json_plugin.py $json_path
 check_ret $?
 
File diff suppressed because one or more lines are too long

@@ -60,12 +60,12 @@ This tools can be used as shown below
 
 ```bash
 $ ./tools/convert_dm_xml_to_json.py
-Usage: ./tools/convert_dm_xml_to_json.py <tr-xxx cwmp xml data model> <tr-xxx usp xml data model> [Object path]
+Usage: python convert_dm_xml_to_json -d <directory>
-Examples:
+Options:
-- ./tools/convert_dm_xml_to_json.py test/tools/tr-181-2-*-cwmp-full.xml test/tools/tr-181-2-*-usp-full.xml Device.
+-d, --directory <directory>: Directory containing XML files to convert to JSON
-==> Generate the json file of the sub tree Device. in tr181.json
+Example:
-- ./tools/convert_dm_xml_to_json.py test/tools/tr-104-2-0-2-cwmp-full.xml test/tools/tr-104-2-0-2-usp-full.xml Device.Services.VoiceService.
+./tools/convert_dm_xml_to_json.py -d test/tools/
-==> Generate the json file of the sub tree Device.Services.VoiceService. in tr104.json
+==> Generate the JSON file containing of all XML files defined under test/tools/ directory in datamodel.json
 
 Example of xml data model file: https://www.broadband-forum.org/cwmp/tr-181-2-*-cwmp-full.xml
 ```

@@ -78,19 +78,12 @@ This tool can generate template "C" code from JSON datamodel definitions.
 
 ```bash
 $ ./tools/convert_dm_json_to_c.py
-Usage: ./tools/convert_dm_json_to_c.py <data model name> [Object path]
+Usage: ./tools/convert_dm_json_to_c.py [Object path]
-data model name: The data model(s) to be used, for ex: tr181 or tr181,tr104
 Examples:
-- ./tools/convert_dm_json_to_c.py tr181
+- ./tools/convert_dm_json_to_c.py
-==> Generate the C code of tr181 data model in datamodel/ folder
+==> Generate the C code of full data model in datamodel/ folder
-- ./tools/convert_dm_json_to_c.py tr104
+- ./tools/convert_dm_json_to_c.py Device.DeviceInfo.
-==> Generate the C code of tr104 data model in datamodel/ folder
-- ./tools/convert_dm_json_to_c.py tr181,tr104
-==> Generate the C code of tr181 and tr104 data model in datamodel/ folder
-- ./tools/convert_dm_json_to_c.py tr181 Device.DeviceInfo.
 ==> Generate the C code of Device.DeviceInfo object in datamodel/ folder
-- ./tools/convert_dm_json_to_c.py tr104 Device.Services.VoiceService.{i}.Capabilities.
-==> Generate the C code of Device.Services.VoiceService.{i}.Capabilities. object in datamodel/ folder
 ```
 
 
@@ -101,7 +94,7 @@ This tool helps in validating the json schema, which is very helpful in the deve
 ```bash
 $ ./tools/validate_json_plugin.py test/files/etc/bbfdm/json/UserInterface.json
 $ ./tools/validate_json_plugin.py test/files/etc/bbfdm/json/X_IOPSYS_EU_TEST.json
-$ ./tools/validate_json_plugin.py dmtree/json/tr181.json
+$ ./tools/validate_json_plugin.py dmtree/json/datamodel.json
 ```
 
 More examples available in [this path](https://dev.iopsys.eu/bbf/bbfdm/-/tree/devel/test/files/etc/bbfdm/plugins).

@@ -161,7 +154,7 @@ The parameters/keys used in tools_input.json file are mostly self-explanatory bu
 
 
 > Note:
-> To add more description about the vendor extended DM objects/parameters, it is required to add the definition of the required/related DM objects/parameters in a json file (The json structure should follow same format as given in [tr181.json](../libbbfdm/dmtree/json/tr181.json)), The same json file need to be defined in dm_json_files list.
+> To add more description about the vendor extended DM objects/parameters, it is required to add the definition of the required/related DM objects/parameters in a json file (The json structure should follow same format as given in [datamodel.json](../libbbfdm/dmtree/json/datamodel.json)), The same json file need to be defined in dm_json_files list.
 
 
 The input json file should be defined as follow:

@@ -179,8 +172,7 @@ The input json file should be defined as follow:
 "test"
 ],
 "dm_json_files": [
-"../libbbfdm/dmtree/json/tr181.json",
+"../libbbfdm/dmtree/json/datamodel.json"
-"../libbbfdm/dmtree/json/tr104.json"
 ]
 "vendor_prefix": "X_IOPSYS_EU_",
 "plugins": [

@@ -206,7 +198,7 @@ The input json file should be defined as follow:
 "proto": "git",
 "version": "tag/hash/branch",
 "dm_files": [
-"src/plugin/datamodel.json"
+"src/plugin/testdm.json"
 ]
 },
 {

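Stitching the fragments of the two hunks above back together, a minimal tools_input.json after this change would look roughly as follows. Only keys actually visible in this diff are included (dm_json_files, vendor_prefix, plugins, proto, version, dm_files); any required fields the hunks do not show are deliberately omitted, so treat this as a sketch rather than a complete input file:

```json
{
    "_note": "Sketch assembled from the hunks above; fields not visible in this diff are omitted",
    "dm_json_files": [
        "../libbbfdm/dmtree/json/datamodel.json"
    ],
    "vendor_prefix": "X_IOPSYS_EU_",
    "plugins": [
        {
            "proto": "git",
            "version": "tag/hash/branch",
            "dm_files": [
                "src/plugin/testdm.json"
            ]
        }
    ]
}
```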
@@ -1,7 +1,7 @@
 #!/usr/bin/python3
 
-# Copyright (C) 2021 iopsys Software Solutions AB
+# Copyright (C) 2024 iopsys Software Solutions AB
-# Author: Amin Ben Ramdhane <amin.benramdhane@pivasoftware.com>
+# Author: Amin Ben Romdhane <amin.benromdhane@iopsys.eu>
 
 import sys
 import os

@@ -13,10 +13,7 @@ import glob
 # Constants
 BBF_ERROR_CODE = 0
 CURRENT_PATH = os.getcwd()
-BBF_DMTREE_PATH = os.path.join(CURRENT_PATH, "libbbfdm", "dmtree")
+DM_JSON_FILE = os.path.join(CURRENT_PATH, "libbbfdm", "dmtree", "json", "datamodel.json")
-BBF_DMTREE_PATH_TR181_JSON = os.path.join(BBF_DMTREE_PATH, "json", "tr181.json")
-BBF_DMTREE_PATH_TR104_JSON = os.path.join(BBF_DMTREE_PATH, "json", "tr104.json")
-ARRAY_JSON_FILES = {"tr181": BBF_DMTREE_PATH_TR181_JSON, "tr104": BBF_DMTREE_PATH_TR104_JSON}
 
 LIST_SUPPORTED_USP_DM = []
 LIST_SUPPORTED_CWMP_DM = []

@@ -1,7 +1,7 @@
 #!/usr/bin/python3
 
-# Copyright (C) 2021 iopsys Software Solutions AB
+# Copyright (C) 2024 iopsys Software Solutions AB
-# Author: Amin Ben Ramdhane <amin.benramdhane@pivasoftware.com>
+# Author: Amin Ben Romdhane <amin.benromdhane@iopsys.eu>
 
 from __future__ import print_function
 
@@ -657,22 +657,14 @@ def printOBJline(dmobject, value):
 
 
 def print_dmc_usage():
-print("Usage: " + sys.argv[0] + " <data model name>" + " [Object path]")
+print("Usage: " + sys.argv[0] + " [Object path]")
-print("data model name: The data model(s) to be used, for ex: tr181 or tr181,tr104")
 print("Examples:")
-print(" - " + sys.argv[0] + " tr181")
+print(" - " + sys.argv[0])
-print(" ==> Generate the C code of tr181 data model in datamodel/ folder")
+print(" ==> Generate the C code of full data model in datamodel/ folder")
-print(" - " + sys.argv[0] + " tr104")
+print(" - " + sys.argv[0] + " Device.DeviceInfo.")
-print(" ==> Generate the C code of tr104 data model in datamodel/ folder")
-print(" - " + sys.argv[0] + " tr181,tr104")
-print(" ==> Generate the C code of tr181 and tr104 data model in datamodel/ folder")
-print(" - " + sys.argv[0] + " tr181" + " Device.DeviceInfo.")
 print(" ==> Generate the C code of Device.DeviceInfo object in datamodel/ folder")
-print(" - " + sys.argv[0] + " tr104" +
+print(" - " + sys.argv[0] + " Device.Services.VoiceService.{i}.DECT.Base.{i}.")
-" Device.Services.VoiceService.{i}.Capabilities.")
+print(" ==> Generate the C code for a specific multi-instance object in datamodel/ folder")
-print(
-" ==> Generate the C code of Device.Services.VoiceService.{i}.Capabilities. object in datamodel/ folder")
 
 
 def object_parse_childs(dmobject, value, nextlevel):
 hasobj = bbf.obj_has_child(value)

@@ -842,57 +834,44 @@ def removetmpfiles():
 bbf.remove_file("./.events.c")
 
 
-### main ###
+def generatecfromspecificobj(passed_data, obj_to_find):
-if len(sys.argv) < 2:
+for _obj, _value in passed_data.items():
-print_dmc_usage()
+if isinstance(_value, dict) and 'type' in _value and _value['type'] == "object":
-exit(1)
+if _obj != obj_to_find:
+generatecfromspecificobj(_value, obj_to_find)
+else:
+return generatecfromobj(_obj, _value, DMC_DIR, 0)
 
-if (sys.argv[1]).lower() == "-h" or (sys.argv[1]).lower() == "--help":
+### main ###
+if len(sys.argv) > 1 and (sys.argv[1]).lower() in ["-h", "--help"]:
 print_dmc_usage()
 exit(1)
 
 bbf.remove_folder(DMC_DIR)
-dm_name = sys.argv[1].split(",")
-for index in range(sys.argv[1].count(',') + 1):
 
-JSON_FILE = bbf.ARRAY_JSON_FILES.get(dm_name[index], None)
+json_file = open(bbf.DM_JSON_FILE, "r", encoding='utf-8')
+data = json.loads(json_file.read(), object_pairs_hook=OrderedDict)
 
-if JSON_FILE is not None:
+for dm_obj, dm_value in data.items():
-j_file = open(JSON_FILE, "r", encoding='utf-8')
-data = json.loads(j_file.read(), object_pairs_hook=OrderedDict)
 
-for dm_obj, dm_value in data.items():
+if dm_obj is None or not isinstance(dm_value, dict):
-if dm_obj is None:
 print("Wrong JSON Data model format!")
-continue
+exit(0)
 
-# Generate the object file if it is defined by "sys.argv[2]" argument
+# Generate the object file if it is defined by "sys.argv[1]" argument
-if len(sys.argv) > 2:
+if len(sys.argv) > 1 and sys.argv[1] != dm_obj:
-if sys.argv[2] != dm_obj:
+generatecfromspecificobj(dm_value, sys.argv[1])
-if isinstance(dm_value, dict):
-for obj1, value1 in dm_value.items():
-if obj1 == sys.argv[2]:
-if isinstance(value1, dict):
-for obj2, value2 in value1.items():
-if obj2 == "type" and value2 == "object":
-generatecfromobj(
-obj1, value1, DMC_DIR, 0)
-break
-break
 break
 
-# Generate the root object tree file if amin does not exist
+# Generate the root object tree file
 generatecfromobj(dm_obj, dm_value, DMC_DIR, 1)
 
-# Generate the sub object tree file if amin does not exist
+# Generate the sub object tree file
-if isinstance(dm_value, dict):
 for obj1, value1 in dm_value.items():
-if isinstance(value1, dict):
+if isinstance(value1, dict) and 'type' in value1 and value1['type'] == "object":
-for obj2, value2 in value1.items():
-if obj2 == "type" and value2 == "object":
 generatecfromobj(obj1, value1, DMC_DIR, 0)
-else:
-print("!!!! %s : Data Model doesn't exist" % dm_name[index])
 
 if os.path.isdir(DMC_DIR):
 print("Source code generated under \"%s\" folder" % DMC_DIR)

File diff suppressed because it is too large

@@ -1,7 +1,7 @@
 #!/usr/bin/python3
 
-# Copyright (C) 2021 iopsys Software Solutions AB
+# Copyright (C) 2024 iopsys Software Solutions AB
-# Author: Amin Ben Ramdhane <amin.benramdhane@pivasoftware.com>
+# Author: Amin Ben Romdhane <amin.benromdhane@iopsys.eu>
 
 import sys
 import json

@@ -114,7 +114,7 @@ else:
 
 if _format == "xls":
 output_file_name = output_dir + '/' + output_file_prefix + '.xls'
-bbf_excel.generate_excel(['tr181', 'tr104'], output_file_name)
+bbf_excel.generate_excel(output_file_name)
 
 print("Datamodel generation completed, aritifacts shall be available in out directory or as per input json configuration")
 

@@ -1,7 +1,7 @@
 #!/usr/bin/python3
 
-# Copyright (C) 2021 iopsys Software Solutions AB
+# Copyright (C) 2024 iopsys Software Solutions AB
-# Author: Amin Ben Ramdhane <amin.benramdhane@pivasoftware.com>
+# Author: Amin Ben Romdhane <amin.benromdhane@iopsys.eu>
 
 from collections import OrderedDict
 
@@ -66,39 +66,27 @@ def parse_vendor_object(list_read, list_write):
 add_data_to_list_dm(list_write, param, "Yes", "No")
 
 
-def load_json_data(dm_name):
+def parse_object(list_read, list_write, proto):
-JSON_FILE = bbf.ARRAY_JSON_FILES.get(dm_name, None)
+with open(bbf.DM_JSON_FILE, "r", encoding='utf-8') as file:
-if JSON_FILE is None:
+data = json.load(file, object_pairs_hook=OrderedDict)
-print(f"!!!! {dm_name} : Data Model doesn't exist")
-return None
 
-with open(JSON_FILE, "r", encoding='utf-8') as file:
-return json.load(file, object_pairs_hook=OrderedDict)
 
-def parse_object(dm_name_list, list_read, list_write, proto):
-for dm in dm_name_list:
-data = load_json_data(dm)
 if data is not None:
 for obj, value in data.items():
 if obj is None:
-print(f'!!!! {dm} : Wrong JSON Data model format!')
+print(f'!!!! {bbf.DM_JSON_FILE} : Wrong JSON Data model format!')
 else:
 parse_standard_object(list_read, list_write, obj, value, proto)
 
 parse_vendor_object(list_read, list_write)
 
 
-def parse_object_tree(dm_name_list):
+def parse_object_tree():
-if isinstance(dm_name_list, list) is False:
-return None
 
 # Usage for USP Data Model
 LIST_SUPPORTED_USP_DM = bbf.LIST_SUPPORTED_USP_DM
-parse_object(dm_name_list, LIST_SUPPORTED_USP_DM, LIST_USP_DM, "usp")
+parse_object(LIST_SUPPORTED_USP_DM, LIST_USP_DM, "usp")
 
 # Usage for CWMP Data Model
 LIST_SUPPORTED_CWMP_DM = bbf.LIST_SUPPORTED_CWMP_DM[:]
-parse_object(dm_name_list, LIST_SUPPORTED_CWMP_DM, LIST_CWMP_DM, "cwmp")
+parse_object(LIST_SUPPORTED_CWMP_DM, LIST_CWMP_DM, "cwmp")
 
 def generate_excel_sheet(sheet, title, data, style_mapping):
 style_title = style_mapping["title"]

@@ -182,10 +170,10 @@ def generate_excel_file(output_file):
 wb.save(output_file)
 
 
-def generate_excel(dm_name_list, output_file="datamodel.xml"):
+def generate_excel(output_file="datamodel.xml"):
 print("Generating BBF Data Models in Excel format...")
 
-parse_object_tree(dm_name_list)
+parse_object_tree()
 generate_excel_file(output_file)
 
 if os.path.isfile(output_file):

@@ -201,14 +189,6 @@ if __name__ == '__main__':
 epilog='Part of BBF-tools, refer Readme for more examples'
 )
 
-parser.add_argument(
-'-d', '--datamodel',
-action = 'append',
-metavar='tr181',
-choices= ['tr181', 'tr104'],
-required= True,
-)
-
 parser.add_argument(
 '-r', '--remote-dm',
 action='append',

@@ -253,6 +233,6 @@
 plugins.append(r)
 
 bbf.generate_supported_dm(args.vendor_prefix, args.vendor_list, plugins)
-generate_excel(args.datamodel, args.output)
+generate_excel(args.output)
 print(f'Datamodel generation completed, aritifacts available in {args.output}')
 sys.exit(bbf.BBF_ERROR_CODE)

@@ -1,7 +1,7 @@
 #!/usr/bin/python3
 
-# Copyright (C) 2021 iopsys Software Solutions AB
+# Copyright (C) 2024 iopsys Software Solutions AB
-# Author: Amin Ben Ramdhane <amin.benramdhane@pivasoftware.com>
+# Author: Amin Ben Romdhane <amin.benromdhane@iopsys.eu>
 
 import os
 import sys

@@ -9,8 +9,7 @@
 "iopsys"
 ],
 "dm_json_files": [
-"libbbfdm/dmtree/json/tr181.json",
+"libbbfdm/dmtree/json/datamodel.json",
-"libbbfdm/dmtree/json/tr104.json",
 "libbbfdm/dmtree/vendor/iopsys/vendor.json"
 ],
 "vendor_prefix": "X_IOPSYS_EU_",

@@ -1,7 +1,7 @@
 #!/usr/bin/python3
 
-# Copyright (C) 2021 iopsys Software Solutions AB
+# Copyright (C) 2024 iopsys Software Solutions AB
-# Author: Amin Ben Ramdhane <amin.benramdhane@pivasoftware.com>
+# Author: Amin Ben Romdhane <amin.benromdhane@iopsys.eu>
 
 import sys
 import json

@@ -276,7 +276,7 @@ command_schema = {
 def print_validate_json_usage():
 print("Usage: " + sys.argv[0] + " <dm json file>")
 print("Examples:")
-print(" - " + sys.argv[0] + " tr181.json")
+print(" - " + sys.argv[0] + " datamodel.json")
 print(" ==> Validate the json file")
 print("")
 exit(1)