Json2Flat

This library converts JSON documents to CSV.
It uses google-gson and JsonPath for conversion.


Maven dependency:

<dependency>
    <groupId>com.github.opendevl</groupId>
    <artifactId>json2flat</artifactId>
    <version>1.0.3</version>
</dependency>

Example

String str = new String(Files.readAllBytes(Paths.get("/path/to/source/file.json")));

JFlat flatMe = new JFlat(str);

//get the 2D representation of JSON document
List<Object[]> json2csv = flatMe.json2Sheet().getJsonAsSheet();

//write the 2D representation in csv format
flatMe.write2csv("/path/to/destination/file.csv");

OR

String str = new String(Files.readAllBytes(Paths.get("/path/to/source/file.json")));

JFlat flatMe = new JFlat(str);

//directly write the JSON document to CSV
flatMe.json2Sheet().write2csv("/path/to/destination/file.csv");

//directly write the JSON document to CSV with a custom delimiter
flatMe.json2Sheet().write2csv("/path/to/destination/file.csv", '|');

Input JSON

{
    "store": {
        "book": [
            {
                "name": "dasd",
                "category": "reference",
                "author": "Nigel Rees",
                "title": "Sayings of the Century",
                "price": 8.95,
                "marks": [3, 99, 89]
            },
            {
                "category": "fiction",
                "author": "Evelyn Waugh",
                "title": "Sword of Honour",
                "price": 12.99,
                "marks": [3, 99, 89, 34, 67567]
            },
            {
                "category": "fiction",
                "author": "Herman Melville",
                "title": "Moby Dick",
                "isbn": "0-553-21311-3",
                "price": 8.99,
                "marks": [3, 99, 89]
            },
            {
                "category": "fiction",
                "author": "J. R. R. Tolkien",
                "title": "The Lord of the Rings",
                "isbn": "0-395-19395-8",
                "price": 22.99,
                "marks": []
            }
        ]
    }
}

Output CSV

/store/book/name,/store/book/category,/store/book/author,/store/book/title,/store/book/price,/store/book/marks/0,/store/book/marks/1,/store/book/marks/2,/store/book/marks/3,/store/book/marks/4,/store/book/isbn
dasd,reference,Nigel Rees,Sayings of the Century,8.95,3,99,89,,,
,fiction,Evelyn Waugh,Sword of Honour,12.99,3,99,89,34,67567,
,fiction,Herman Melville,Moby Dick,8.99,3,99,89,,,0-553-21311-3
,fiction,J. R. R. Tolkien,The Lord of the Rings,22.99,,,,,,0-395-19395-8

If you want to remove the "/" from the header names, use the headerSeparator() method, e.g.

To change /store/book/name to store book name:

flatMe.json2Sheet().headerSeparator().write2csv("/path/to/destination/file.csv");

To change /store/book/name to store_book_name:

flatMe.json2Sheet().headerSeparator("_").write2csv("/path/to/destination/file.csv");



json2flat's Issues

Remove slash (/) character from header

Hi

Thanks a lot for a great tool. This jar helped me solve some of my problems. I have one simple issue. I have a sample dataset like the one below, which we are using to generate the CSV file. The issue is that the CSV file header becomes:
/name, /category, /author, /title, /price

I want to remove this / character from the CSV file header. Please suggest how this is possible.

[
    {
        "name": "dasd",
        "category": "reference",
        "author": "Nigel Rees",
        "title": "Sayings of the Century",
        "price": 8.95
    },
    {
        "category": "fiction",
        "author": "Evelyn Waugh",
        "title": "Sword of Honour",
        "price": 12.99
    }
]

Thanks a lot.
Regards,
Prakash

Extra row generated when primitive value follows array in input json

JFlat produces an extra row when a primitive element follows an array in the input JSON.
It works properly if the primitive element is moved before the array.
The input JSON that produces the error is below. If the 'expired' element is moved before the array, the output is correct.

{
  "number": "334455",
  "customerInfo": {
       "service": [
        {
          "customer": "Verizon"
        },
        {
          "customer": "Rogers"
        },
        {
          "customer": "Fido"
        }
      ]
  },
  "expired": true
}

Screenshots are attached: jflat-extra-row, jflat-ok.

When parsing int values in JSON, the resulting csv has them as double

Take the JSON in the Readme.md for example:
...
{
    "name": "dasd",
    "category": "reference",
    "author": "Nigel Rees",
    "title": "Sayings of the Century",
    "price": 8.95,
    "marks": [3, 99, 89]
}
...

When the result for "marks", which is originally of type int, is parsed and examined in a text editor, it looks like this:
3.0,99.0,89.0

Can there be a fix for this, or are all ints going to be defaulted to doubles?
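Gson parses every JSON number as a double, which is why integral marks come out as "3.0". A hedged post-processing sketch, not part of the JFlat API (the class name NumberCell is illustrative): render a cell value without the ".0" suffix whenever it is mathematically an integer.

```java
public class NumberCell {
    // Render an integral double as "3" instead of "3.0"; leave real
    // fractions such as 8.95 untouched.
    public static String format(double value) {
        if (!Double.isInfinite(value) && value == Math.floor(value)) {
            return String.valueOf((long) value);
        }
        return String.valueOf(value);
    }

    public static void main(String[] args) {
        System.out.println(format(3.0));   // 3
        System.out.println(format(8.95));  // 8.95
    }
}
```

Applying this to each numeric cell before writing the sheet would keep ints looking like ints, at the cost of assuming no value actually needs a trailing ".0".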

Checked out master version differs from maven repo version

Hi,

I checked out the master version of the code (1.0.3).
It contains the following method:
com.github.opendevl.JFlat.write2csv(char)
But when I refer to the Maven repo JAR for the same version (1.0.3), it doesn't contain this method.
I need this method.

Thanks,
Dhananjay

JSON to CSV conversion is failing

Dear Sir

I am trying to convert JSON data obtained from a REST URL into CSV. The issue I am facing is that the first time I call my program (fresh execution), the CSV file is generated successfully. In the same session, if I call the generate-CSV method again on the JSON data, it throws the error below:

com.google.gson.JsonSyntaxException: com.google.gson.stream.MalformedJsonException: Unterminated object at line 1 column 420 path $[8].desig
at com.google.gson.internal.Streams.parse(Streams.java:60)
at com.google.gson.JsonParser.parse(JsonParser.java:84)
at com.google.gson.JsonParser.parse(JsonParser.java:59)
at com.google.gson.JsonParser.parse(JsonParser.java:45)
at com.github.opendevl.JFlat.json2Sheet(JFlat.java:186)
at com.github.opendevl.cdvprocess.TSheetOperation.transformJsonToCSVAndSaveInHDFS(TSheetOperation.java:39)
at org.ciscocdv.javarest.resources.CDVDataIntegrationSaveResources.getSavedResourceInHDFS(CDVDataIntegrationSaveResources.java:36)
at org.ciscocdv.javarest.restcdvdlppkg.CDVWebServiceOperations.saveDataIntegrationOutputInHDFS(CDVWebServiceOperations.java:44)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at com.sun.jersey.spi.container.JavaMethodInvokerFactory$1.invoke(JavaMethodInvokerFactory.java:60)
at com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$ResponseOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:205)
at com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:75)
at com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:302)
at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
at com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:108)
at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
at com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:84)
at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1542)
at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1473)
at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1419)
at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1409)
at com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:409)
at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:558)
at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:733)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:731)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:303)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:52)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
at org.apache.catalina.filters.CorsFilter.handleSimpleCORS(CorsFilter.java:301)
at org.apache.catalina.filters.CorsFilter.doFilter(CorsFilter.java:169)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:218)
at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:122)
at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:505)
at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:169)
at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:103)
at org.apache.catalina.valves.AccessLogValve.invoke(AccessLogValve.java:956)
at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:116)
at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:442)
at org.apache.coyote.http11.AbstractHttp11Processor.process(AbstractHttp11Processor.java:1082)
at org.apache.coyote.AbstractProtocol$AbstractConnectionHandler.process(AbstractProtocol.java:623)
at org.apache.tomcat.util.net.JIoEndpoint$SocketProcessor.run(JIoEndpoint.java:316)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61)
at java.lang.Thread.run(Thread.java:745)
Caused by: com.google.gson.stream.MalformedJsonException: Unterminated object at line 1 column 420 path $[8].desig
at com.google.gson.stream.JsonReader.syntaxError(JsonReader.java:1559)
at com.google.gson.stream.JsonReader.doPeek(JsonReader.java:491)
at com.google.gson.stream.JsonReader.hasNext(JsonReader.java:414)
at com.google.gson.internal.bind.TypeAdapters$29.read(TypeAdapters.java:738)
at com.google.gson.internal.bind.TypeAdapters$29.read(TypeAdapters.java:731)
at com.google.gson.internal.bind.TypeAdapters$29.read(TypeAdapters.java:714)
at com.google.gson.internal.Streams.parse(Streams.java:48)
... 51 more

I am running short of time to solve this issue. Please suggest a way to solve it.
Thanks in advance.

Column Reordering

It seems the column order does not follow the JSON data.

for example:
JSON DATA
[ {"e":1, "b":"s", "g":123, "d":"test"}, {"e":4, "b":"ssd", "g":321, "d":"haha"}, {"e":2, "b":"42dd", "g":455, "d":"testing"} ]

after json2Sheet(), it becomes
d, g, e, b
test, 123, 1, s
haha, 321, 4, ssd
testing, 455, 2, 42dd

expected output is:
e, b, g, d
1, s, 123, test
4, ssd, 321, haha
2, 42dd, 455, testing
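The column order in the output appears to follow how the parsed object's members are stored rather than their order in the input. A hedged post-processing sketch (ReorderColumns is a hypothetical helper, not JFlat API): reorder the 2D sheet returned by getJsonAsSheet() to a caller-chosen header order.

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class ReorderColumns {
    // Rearrange every row of the sheet so its columns follow `wanted`;
    // row 0 is assumed to be the header row.
    public static List<Object[]> reorder(List<Object[]> sheet, String[] wanted) {
        List<Object> header = Arrays.asList(sheet.get(0));
        int[] index = new int[wanted.length];
        for (int i = 0; i < wanted.length; i++) {
            index[i] = header.indexOf(wanted[i]); // position of each wanted column
        }
        List<Object[]> out = new ArrayList<>();
        for (Object[] row : sheet) {
            Object[] newRow = new Object[wanted.length];
            for (int i = 0; i < wanted.length; i++) {
                newRow[i] = (index[i] >= 0 && index[i] < row.length) ? row[index[i]] : null;
            }
            out.add(newRow);
        }
        return out;
    }

    public static void main(String[] args) {
        List<Object[]> sheet = new ArrayList<>();
        sheet.add(new Object[]{"d", "g", "e", "b"});
        sheet.add(new Object[]{"test", 123, 1, "s"});
        List<Object[]> fixed = reorder(sheet, new String[]{"e", "b", "g", "d"});
        System.out.println(Arrays.toString(fixed.get(1))); // [1, s, 123, test]
    }
}
```

This only restores a known, fixed ordering; it cannot recover the original input order if the caller does not know it.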

Format result

Nice util, but I found a problem: double quotation marks appear in the getJsonAsSheet output even though the original file does not contain double quotation marks.

Screenshot attached: Dingtalk_20210120180728

Order is changed

I am parsing multiple files, and some have null values. The order seems to be changed. Can you please suggest a fix?

Only first set of object is getting parsed

Let's say I have a JSON object:

[
    {
        "defaultaddress": "6789 Amphitheatre vamshi Mountain View",
        "isinactive": false,
        "isprivate": false,
        "custentity_celigo_calendar_enabled": false,
        "firstname": "Elizabeth",
        "resubscribelink": "Send Subscription Email",
        "lastmodifieddate": "12/15/2016 10:04 pm",
        "custentity_celigo_gcontact_do_not_sync": false,
        "entityid": "Elizabeth Bennet updated",
        "datecreated": "12/15/2016 2:31 am",
        "globalsubscriptionstatus": {
            "internalid": "2",
            "name": "Soft Opt-Out"
        },
        "subsidiary": {
            "internalid": "1",
            "name": "Parent Company"
        },
        "lastname": "Bennet",
        "addressbook": [
            {
                "addressbookaddress_text": "6789 Amphitheatre vamshi Mountain View",
                "defaultbilling": true,
                "defaultshipping": false,
                "id": 7508,
                "country": {
                    "internalid": "US",
                    "name": "United States"
                },
                "override": true,
                "addrtext": "6789 Amphitheatre vamshi Mountain View"
            }
        ],
        "customform": {
            "internalid": "-40",
            "name": "Standard Contact Form"
        },
        "phone": "(206)888-1212",
        "recordtype": "contact",
        "id": "54278",
        "officephone": "(206)888-1212",
        "email": "[email protected]",
        "usernotes": [
            {
                "internalid": "",
                "externalid": "",
                "notedate": " ",
                "title": "",
                "note": "",
                "direction": "",
                "notetype": ""
            }
        ],
        "_billing_addressbook": {
            "addressbookaddress_text": "6789 Amphitheatre vamshi Mountain View",
            "defaultbilling": true,
            "defaultshipping": false,
            "id": 7508,
            "country": {
                "internalid": "US",
                "name": "United States"
            },
            "override": true,
            "addrtext": "6789 Amphitheatre vamshi Mountain View"
        }
    },
    {
        "isinactive": false,
        "isprivate": false,
        "custentity_celigo_calendar_enabled": false,
        "firstname": "contact",
        "resubscribelink": "Send Subscription Email",
        "lastmodifieddate": "12/7/2016 6:34 am",
        "custentity_celigo_gcontact_do_not_sync": false,
        "entityid": "frm ns contact",
        "datecreated": "12/7/2016 6:34 am",
        "globalsubscriptionstatus": {
            "internalid": "2",
            "name": "Soft Opt-Out"
        },
        "subsidiary": {
            "internalid": "1",
            "name": "Parent Company"
        },
        "lastname": "ns",
        "customform": {
            "internalid": "-40",
            "name": "Standard Contact Form"
        },
        "recordtype": "contact",
        "id": "52075",
        "email": "[email protected]",
        "usernotes": [
            {
                "internalid": "",
                "externalid": "",
                "notedate": " ",
                "title": "",
                "note": "",
                "direction": "",
                "notetype": ""
            }
        ]
    }
]

Only the data inside the first {} is getting formatted.

Can you please help me out?

Doesn't work if one Json record per line

com.google.gson.JsonSyntaxException: com.google.gson.stream.MalformedJsonException: Use JsonReader.setLenient(true) to accept malformed JSON at line 2 column 2 path $
at com.google.gson.JsonParser.parse(JsonParser.java:65)
at com.google.gson.JsonParser.parse(JsonParser.java:45)
at com.github.opendevl.JFlat.json2Sheet(JFlat.java:153)
at com.cognizant.ddhkafka.App.main(App.java:167)
Caused by: com.google.gson.stream.MalformedJsonException: Use JsonReader.setLenient(true) to accept malformed JSON at line 2 column 2 path $
at com.google.gson.stream.JsonReader.syntaxError(JsonReader.java:1559)
at com.google.gson.stream.JsonReader.checkLenient(JsonReader.java:1401)
at com.google.gson.stream.JsonReader.doPeek(JsonReader.java:542)
at com.google.gson.stream.JsonReader.peek(JsonReader.java:425)
at com.google.gson.JsonParser.parse(JsonParser.java:60)
... 3 more

Input:
{"id":503,"name":"raju5","age":37,"last_updt":1519868433000}

But the below works fine,
{
"id":503,
"name":"raju5",
"age":37,
"last_updt":1519868433000
}
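Since the parser expects one complete JSON document, newline-delimited records (one object per line) can be wrapped into a JSON array before the string is handed to JFlat. A hedged pre-processing sketch, assuming every non-empty line is itself a valid JSON object (the class name NdjsonToArray is illustrative):

```java
import java.util.List;
import java.util.stream.Collectors;

public class NdjsonToArray {
    // Join one-object-per-line records into a single JSON array document.
    public static String wrap(List<String> lines) {
        return lines.stream()
                .map(String::trim)
                .filter(l -> !l.isEmpty())
                .collect(Collectors.joining(",", "[", "]"));
    }

    public static void main(String[] args) {
        String doc = wrap(List.of(
                "{\"id\":503,\"name\":\"raju5\"}",
                "{\"id\":504,\"name\":\"raju6\"}"));
        System.out.println(doc); // [{"id":503,"name":"raju5"},{"id":504,"name":"raju6"}]
    }
}
```

The wrapped string can then be passed to the JFlat constructor like any other JSON array document.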

How to Handle Arabic characters in csv

Hi team,

JSON values containing Arabic characters are displayed as symbols in the CSV. Screenshot attached: arabic_csv

sample json:
{
    "data": [{
        "key1": "مرحبا",
        "key2": "أهلا"
    }]
}
code:
JFlat flatMe = new JFlat(str); // here str is StandardCharsets.UTF_8 encoded

//directly write the JSON document to CSV
flatMe.json2Sheet().write2csv("/path/to/destination/file.csv");
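One common cause of this symptom is not the CSV bytes themselves but the viewer: Excel assumes a legacy code page unless the file starts with a UTF-8 byte-order mark. A hedged workaround sketch, outside the JFlat API (Utf8BomWriter is a hypothetical helper): write the BOM, then the CSV content, as UTF-8.

```java
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.OutputStreamWriter;
import java.io.Writer;
import java.nio.charset.StandardCharsets;

public class Utf8BomWriter {
    // Write CSV content as UTF-8 with a leading BOM so Excel detects the encoding.
    public static void writeCsv(String path, String csvContent) throws IOException {
        try (Writer w = new OutputStreamWriter(new FileOutputStream(path), StandardCharsets.UTF_8)) {
            w.write('\uFEFF'); // UTF-8 BOM: bytes EF BB BF
            w.write(csvContent);
        }
    }

    public static void main(String[] args) throws IOException {
        java.io.File f = java.io.File.createTempFile("arabic", ".csv");
        writeCsv(f.getPath(), "key1,key2\nمرحبا,أهلا\n");
        byte[] b = java.nio.file.Files.readAllBytes(f.toPath());
        System.out.println((b[0] & 0xFF) == 0xEF); // true: file starts with the BOM
    }
}
```

With this approach you would build the CSV text in memory (e.g. from getJsonAsSheet()) and write it yourself instead of calling write2csv directly.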

Running this code on HDFS

I am trying to convert a json file which is present on HDFS to CSV. Below two main classes I have tried for this:

1-
public class ClassMain {

    public static void main(String[] args) throws IOException {
        String uri = args[1];
        String uri1 = args[2];

        Configuration conf = new Configuration();
        conf.set("fs.defaultFS", "hdfs:///ip-10-16-37-124:9000/");

        String str = new String(Files.readAllBytes(Paths.get(uri)));

        //FileSystem fs = FileSystem.get(URI.create("hdfs:///" + uri), conf);

        FileSystem fs = FileSystem.get(conf);
        FSDataInputStream in = null;
        FSDataOutputStream out = fs.create(new Path(uri1));
        try {
            in = fs.open(new Path(uri));
            JsonToCSV toCSV = new JsonToCSV(str);
            toCSV.json2Sheet().write2csv(uri1);
            IOUtils.copyBytes(in, out, 4096, false);
        }
        finally {
            IOUtils.closeStream(in);
            IOUtils.closeStream(out);
        }
    }
}

I am running the jar as:
hadoop jar json-csv-hdfs.jar com.nishant.ClassMain /nishant/small.json /nishant/small.csv

But somehow it reads the URI of my input file as hdfs:/ip-10-16-37-124:9000/nishant/small.json, which is incorrect. I have tried all possible combinations of the URI, but the formed URI does not change. I thought maybe Files.readAllBytes was causing a problem while reading the JSON file, so I tried using a buffered reader to read the input file. Below is the second main class I wrote for this.

2-
public class ClassMain {

    public static void main(String[] args) throws IOException {
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(conf);
        Path inFile = new Path(args[1]);
        Path outFile = new Path(args[2]);

        if (!fs.exists(inFile))
            System.out.println("Input file not found");
        if (!fs.isFile(inFile))
            System.out.println("Input should be a file");
        if (fs.exists(outFile))
            System.out.println("Output already exists");

        FSDataInputStream in = fs.open(inFile);
        FSDataOutputStream out = fs.create(outFile);
        byte[] buffer = new byte[12000];
        try {
            int bytesRead = 0;
            while ((bytesRead = in.read(buffer)) > 0) {
                String str = new String(buffer);
                JsonToCSV toCSV = new JsonToCSV(str);
                toCSV.json2Sheet().write2csv(outFile.toString());
                out.write(buffer, 0, bytesRead);
            }
        } catch (IOException e) {
            System.out.println("Error while copying file");
        } finally {
            in.close();
            out.close();
        }
    }
}

Here, it is reading the input files all right. I tried printing the value of this.jsonString from JsonFlat.java and it was able to read valid JSON. But the method call
ele = new JsonParser().parse(this.jsonString); is not working properly and gives the stack trace below:

Exception in thread "main" com.google.gson.JsonSyntaxException: com.google.gson.stream.MalformedJsonException: Use JsonReader.setLenient(true) to accept malformed JSON at line 4 column 2
at com.google.gson.JsonParser.parse(JsonParser.java:65)
at com.google.gson.JsonParser.parse(JsonParser.java:45)
at com.nishant.JsonToCSV.json2Sheet(JsonToCSV.java:105)
at com.nishant.ClassMain.main(ClassMain.java:45)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:483)
at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
Caused by: com.google.gson.stream.MalformedJsonException: Use JsonReader.setLenient(true) to accept malformed JSON at line 4 column 2
at com.google.gson.stream.JsonReader.syntaxError(JsonReader.java:1505)
at com.google.gson.stream.JsonReader.checkLenient(JsonReader.java:1386)
at com.google.gson.stream.JsonReader.doPeek(JsonReader.java:531)
at com.google.gson.stream.JsonReader.peek(JsonReader.java:414)
at com.google.gson.JsonParser.parse(JsonParser.java:60)
... 9 more

The parse method is called with this.jsonString as a parameter, which is valid JSON as I have printed, so why is this throwing a malformed JSON exception?

Is it because of "public JsonElement parse(Reader json) throws JsonIOException, JsonSyntaxException {" Reader data type?

How to run this code in HDFS which would solve my problem?

value of this.jsonString which is a valid JSON:

[
{"uploadTimeStamp":"1488793033624","PDID":"123","data":[{"Data":{"unit":"rpm","value":"100"},"EventID":"E1","PDID":"123","Timestamp":1488793033624,"Timezone":330,"Version":"1.0","pii":{}},{"Data":{"heading":"N","loc1":"false","loc2":"00.001","loc3":"00.004","loc4":"false","speed":"10"},"EventID":"E2","PDID":"123","Timestamp":1488793033624,"Timezone":330,"Version":"1.1","pii":{}},{"Data":{"xvalue":"1.1","yvalue":"1.2","zvalue":"2.2"},"EventID":"E3","PDID":"123","Timestamp":1488793033624,"Timezone":330,"Version":"1.0","pii":{}},{"EventID":"E4","Data":{"value":"50","unit":"percentage"},"Version":"1.0","Timestamp":1488793033624,"PDID":"123","Timezone":330},{"Data":{"unit":"kmph","value":"70"},"EventID":"E5","PDID":"123","Timestamp":1488793033624,"Timezone":330,"Version":"1.0","pii":{}}]},
{"uploadTimeStamp":"1488793167598","PDID":"124","data":[{"Data":{"unit":"rpm","value":"100"},"EventID":"E1","PDID":"124","Timestamp":1488793167598,"Timezone":330,"Version":"1.0","pii":{}},{"Data":{"heading":"N","loc1":"false","loc2":"00.001","loc3":"00.004","loc4":"false","speed":"10"},"EventID":"E2","PDID":"124","Timestamp":1488793167598,"Timezone":330,"Version":"1.1","pii":{}},{"Data":{"xvalue":"1.1","yvalue":"1.2","zvalue":"2.2"},"EventID":"E3","PDID":"124","Timestamp":1488793167598,"Timezone":330,"Version":"1.0","pii":{}},{"EventID":"E4","Data":{"value":"50","unit":"percentage"},"Version":"1.0","Timestamp":1488793167598,"PDID":"124","Timezone":330},{"Data":{"unit":"kmph","value":"70"},"EventID":"E5","PDID":"124","Timestamp":1488793167598,"Timezone":330,"Version":"1.0","pii":{}}]}]
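A likely culprit in the second class above is `new String(buffer)`: it converts the entire fixed 12000-byte buffer on every loop pass, including stale or unread trailing bytes, so the parser can see truncated or corrupted JSON even though the file itself is valid. A hedged sketch of the usual fix (ReadFully is an illustrative name): drain the whole stream into one string first, then hand that single string to the converter.

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.nio.charset.StandardCharsets;

public class ReadFully {
    // Read an InputStream to the end, copying only the bytes actually read
    // on each pass, and decode the result as UTF-8.
    public static String readAll(InputStream in) throws IOException {
        ByteArrayOutputStream buf = new ByteArrayOutputStream();
        byte[] chunk = new byte[8192];
        int n;
        while ((n = in.read(chunk)) > 0) {
            buf.write(chunk, 0, n);
        }
        return new String(buf.toByteArray(), StandardCharsets.UTF_8);
    }

    public static void main(String[] args) throws IOException {
        InputStream in = new ByteArrayInputStream(
                "{\"id\":503}".getBytes(StandardCharsets.UTF_8));
        System.out.println(readAll(in)); // {"id":503}
    }
}
```

In the HDFS case this would mean calling readAll on the FSDataInputStream once, constructing the converter from that string, and only then copying bytes to the output.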

Ignoring some fields

Hi, I have a json object and would like to ignore parsing of some fields.
Let's consider this json:
[ {"some":"text", "list":[1, 2, 3, 4, 5]}, {"some":"otherText", "list":[12, 93, 1039, 99491, 33]} ]

how could I ignore list when parsing this json doc?

Release 1.0.4

I found your library hoping to convert my String json to a String csv. I noticed you haven't released the library again to Maven since writing the method to do this. Can you please release a version 1.0.4?

Support writing to more than just the local filesystem

Instead of offering the API

public void write2csv(String destination)

It would be nice to offer something like

public void write2csv(Writer destination)

That way, this library can do more than simply write to a local filesystem. It would let consumers write the CSV to an in-memory string, for example.

Two arrays in JSON , one array is ignored when converted to CSV

Please find below the code and the respective JSON and CSV files. The second array, applied_aspects (outside the suggestions array), is not read correctly.

Files.zip

package JsonToCsv;

import com.github.opendevl.JFlat;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.List;

public class JSONToCSV {
    public static void main(String[] args) {
        try {
            String str = new String(Files.readAllBytes(Paths.get("C:/Users/ruchiwadhwa/Desktop/updated-JSON.json")));

            JFlat flatMe = new JFlat(str);

            List<Object[]> json2csv = flatMe.json2Sheet().getJsonAsSheet();

            flatMe.write2csv("C:/Users/ruchiwadhwa/Desktop/updated-CSV.csv");
        }
        catch (Exception e) {
            System.out.println(e.getMessage());
        }
    }
}

Please help, it is quite urgent.

escape comma in json value while writing to csv file

Commas in JSON values should be escaped while writing to the CSV file.
Sample JSON: here the first object produces three columns because of the comma in the key1 value.
[{
    "key1": "hi, i am fine",
    "key2": "2"
},
{
    "key1": 1,
    "key2": 2
}]
Screenshot attached showing the problem: jflat-issue
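The standard remedy is RFC 4180-style quoting: a field containing a comma, quote, or newline is wrapped in double quotes, with embedded quotes doubled, so "hi, i am fine" stays one column. A hedged sketch of that rule, not JFlat's actual implementation (CsvEscape is an illustrative name):

```java
public class CsvEscape {
    // Quote a field per RFC 4180 when it contains a comma, quote, or newline;
    // embedded double quotes are doubled inside the wrapper.
    public static String escape(String field) {
        if (field.contains(",") || field.contains("\"") || field.contains("\n")) {
            return "\"" + field.replace("\"", "\"\"") + "\"";
        }
        return field;
    }

    public static void main(String[] args) {
        System.out.println(escape("hi, i am fine")); // "hi, i am fine"
        System.out.println(escape("plain"));         // plain
    }
}
```

Applying this per cell while joining a row with commas keeps the column count stable regardless of the cell contents.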
