I recently ran into a requirement to dynamically import data into a MySQL database (but using Java).
I am not sure what the best practice for this is, so let me describe my scenario.
The JSON is dynamic: I cannot know in advance what it will contain each time I read from the API endpoint,
for example:
"table1" : {
"field1" : "value1",
"field2" : "value2",
"field3" : "value3"
}
One time the JSON will look like this, and the next time it might look like:
"table2" : {
"field1" : "value1",
"field2" : "value2",
"field3" : "value3",
"field4" : "value4"
}
So my question is: should I parse the JSON and compare its fields with the columns of the table whose name matches the JSON key?
Or is there a Java library that would let me handle this easily and efficiently?
NOTE: I have to build this as a service that continuously scans a given API at a set interval; each JSON object must be imported into the table it belongs to, and if that table doesn't exist it must be created automatically.
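Concretely, the polling part seems manageable with a plain ScheduledExecutorService; it is the dynamic table mapping I am unsure about. Here is a rough sketch of the service loop I have in mind (the fetch/import step is just a placeholder, and the class and method names are mine):

```java
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicInteger;

public class PollingService {

    // Runs the placeholder task at a fixed period for the given duration,
    // then shuts down and reports how many times it fired.
    static int runForDemo(long periodMillis, long durationMillis) {
        ScheduledExecutorService scheduler = Executors.newSingleThreadScheduledExecutor();
        AtomicInteger polls = new AtomicInteger();

        Runnable task = () -> {
            // Placeholder: fetch JSON from the API endpoint, parse it,
            // and import it into the matching table.
            polls.incrementAndGet();
        };

        // Run immediately, then once per period (use minutes/hours in production).
        scheduler.scheduleAtFixedRate(task, 0, periodMillis, TimeUnit.MILLISECONDS);

        try {
            Thread.sleep(durationMillis);          // let a few cycles run for the demo
            scheduler.shutdown();
            scheduler.awaitTermination(1, TimeUnit.SECONDS);
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
        return polls.get();
    }

    public static void main(String[] args) {
        System.out.println("polled " + runForDemo(100, 350) + " times");
    }
}
```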
Thanks for your patience and support; this is a kind of requirement I have never handled before.
Thank you in advance!
Solution
In my project I used the google-gson API. It worked very well and was painless.
You can find the project and its description here.
Below is a simple example from the API docs which supports multi-dimensional arrays with arbitrarily complex element types:
Collections Examples
import com.google.gson.Gson;
import com.google.gson.reflect.TypeToken;
import java.lang.reflect.Type;
import java.util.Arrays;
import java.util.Collection;

Gson gson = new Gson();
Collection<Integer> ints = Arrays.asList(1, 2, 3, 4, 5);

// Serialization
String json = gson.toJson(ints); // ==> json is [1,2,3,4,5]

// Deserialization
Type collectionType = new TypeToken<Collection<Integer>>(){}.getType();
Collection<Integer> ints2 = gson.fromJson(json, collectionType);
// ==> ints2 holds the same elements as ints
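For the dynamic-table part of your question: once Gson has deserialized the object into a map, generating the DDL and a parameterized INSERT is plain string work. A rough sketch, assuming the JSON object has already been parsed into a Map (the TEXT column type and the method names are illustrative; in real code you must validate table and column names against a whitelist before splicing them into SQL, since identifiers cannot be bound as PreparedStatement parameters):

```java
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.stream.Collectors;

public class SqlBuilder {

    // Build a CREATE TABLE IF NOT EXISTS statement from the parsed JSON fields,
    // so the table is created automatically on first sight.
    static String createTableSql(String table, Map<String, String> fields) {
        String cols = fields.keySet().stream()
                .map(c -> c + " TEXT")
                .collect(Collectors.joining(", "));
        return "CREATE TABLE IF NOT EXISTS " + table + " (" + cols + ")";
    }

    // Build a parameterized INSERT; the values are then bound via
    // PreparedStatement.setString, never concatenated into the SQL.
    static String insertSql(String table, Map<String, String> fields) {
        String cols = String.join(", ", fields.keySet());
        String marks = fields.keySet().stream()
                .map(k -> "?")
                .collect(Collectors.joining(", "));
        return "INSERT INTO " + table + " (" + cols + ") VALUES (" + marks + ")";
    }

    public static void main(String[] args) {
        Map<String, String> fields = new LinkedHashMap<>();
        fields.put("field1", "value1");
        fields.put("field2", "value2");
        // prints CREATE TABLE IF NOT EXISTS table1 (field1 TEXT, field2 TEXT)
        System.out.println(createTableSql("table1", fields));
        // prints INSERT INTO table1 (field1, field2) VALUES (?, ?)
        System.out.println(insertSql("table1", fields));
    }
}
```

With this approach the service only ever compares the incoming field set against the existing columns (e.g. via INFORMATION_SCHEMA) when it needs to add a new column, which answers the "parse and compare" half of the question.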
Hope this helps.