
Json Data Generator

Have you ever needed to generate a realtime stream of json data in order to test an application? When thinking about a good source of streaming data, we often look to the Twitter stream as a solution, but that only gets us so far in prototyping scenarios, and we often fall short because Twitter data only fits a certain set of use cases. There are plenty of online json data generators (like json-generator or mockaroo), but we couldn't find an offline data generator to use in our testing and prototyping, so we decided to build one. We found it so useful that we decided to open source it as well, so others can make use of it in their own projects.

For more information, check out the announcement blog post.

Project Status

Build Status

Features

We had a couple of needs when it came to generating data for testing purposes. They were as follows:

  • Generate json documents that are defined in json themselves. This would allow us to take existing schemas, drop them into the generator, modify them a bit, and start generating data that looks like what we expect in our application.
  • Generate json with random data as values. This includes different types of random data, not just random characters, but things like random names, counters, dates, primitive types, etc.
  • Generate a constant stream of json events that are sent somewhere. We might need to send the data to a log file or to a Kafka topic or something else.
  • Generate events in a defined order, at defined or random time periods in order to act like a real system.

We now have a data generator that supports all of these things that can be run on our own networks and produce streams of json data for applications to consume.

License

Apache License, Version 2.0

Architecture

The Generator has the following basic architecture:

  • JsonDataGenerator - The main application that loads configurations and runs simulations.
  • Simulation Configuration - A json file that represents the overall simulation you would like to run.
  • Workflow Definitions - Json files that define a workflow that is run by your Simulation.

When you start the JsonDataGenerator, you specify your Simulation Configuration which also references one or many Workflow Definitions. The Simulation is loaded and the Workflows are created within the application and each workflow is started within its own thread. Json events are generated and sent to the defined Producers.
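As a rough sketch (the file and workflow names here are placeholders), a minimal Simulation Configuration ties these pieces together by naming the Workflow Definition files to load and the Producers that should receive the generated events:

{
    "workflows": [{
        "workflowName": "example",
        "workflowFilename": "exampleWorkflow.json"
    }],
    "producers": [{
        "type": "logger"
    }]
}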

What's defined Where

There are multiple pieces of information that you as a developer/integrator need to define within your configuration files. We'll break down what goes where in this section.

Simulation Configuration

The Simulation Configuration defines the following:

Property  | Type  | Description
workflows | Array | Defines a list of workflows to run
producers | Array | Defines a list of producers that events should be sent to

A Workflow is defined with the following properties:

Property           | Type    | Description
workflowName       | String  | A name for the workflow
workflowFilename   | String  | The filename of the workflow config file to run
instances          | Integer | The number of identical instances of the workflow to run (optional, default is 1)
customTypeHandlers | Array   | A list of packages where custom type handlers are defined (optional)

Here is an example of a Workflow configuration:

"workflows": [{
            "workflowName": "test",
            "workflowFilename": "exampleWorkflow.json",
            "instances": 1,
            "customTypeHandlers": ["com.example.types", "org.example2.types"]
        }]

A Producer is defined with the following properties:

Property            | Type   | Description
type                | String | The type of the producer
optional properties | String | Other properties are added to the producer config based on the type of Producer

Currently supported Producers and their configuration properties are:

Logger

A Logger Producer writes json events to a file called json-data.log in the logs directory. The logs roll based on time and size. Configure it like so:

{
    "type": "logger"
}

No configuration options exist for the Logger Producer at this time.

File

A File Producer writes json events to files. Each event is written to its own file in the specified directory. Configure it like so:

{
    "type": "file",
    "output.directory": "/tmp/dropbox/test2",
    "file.prefix": "MYPREFIX_",
    "file.extension":".json"
}

The File Producer will attempt to create the directory specified by output.directory.

HTTP POST

An HTTP POST Producer sends json events to a URL as the Request Body. Configure it like so:

{
    "type": "http-post",
    "url": "http://localhost:8050/ingest"
}

If you need to send data to an HTTPS endpoint that requires client certificates, you can provide the configuration of those certificates on the command line using the javax.net.ssl.* properties. An example might be:

java -Djavax.net.ssl.trustStore=/path/to/trustsore.jks -Djavax.net.ssl.keyStore=/path/to/user/cert/mycert.p12 -Djavax.net.ssl.keyStoreType=PKCS12 -Djavax.net.ssl.keyStorePassword=password -jar json-data-generator-1.2.2-SNAPSHOT.jar mySimConfig.json

Kafka

A Kafka Producer sends json events to the specified Kafka broker and topic as a String. Configure it like so:

{
    "type": "kafka",
    "broker.server": "192.168.59.103",
    "broker.port": 9092,
    "topic": "logevent",
    "flatten": false,
    "sync": false
}

If sync=false, all events are sent asynchronously. If for some reason you would like to send events synchronously, set sync=true. If flatten=true, the Producer will flatten the json before sending it to the Kafka topic, meaning that instead of having nested json objects, you will have all properties at the top level using dot notation. For example:

{
    "test": {
        "one": "one",
        "two": "two"
    },
    "array": [
        {
            "element1": "one"
        },{
            "element2": "two"
        }]
}

Would become

{
	"test.one": "one",
	"test.two": "two",
	"array[0].element1": "one",
	"array[1].element2": "two"
}

Kinesis

A Kinesis Producer sends json events to the specified Kinesis stream as a String. Configure it like so:

{
    "type": "kinesis",
    "stream": "data-input-stream",
    "region": ap-southeast-2,
    "max.records": 1000,
    "roleARN": "arn:aws:iam::XXXXXX2342:role/Kinesis-Access-Role"
}

By default, it will use the DefaultAWSCredentialsProviderChain for credentials. If you want cross-account access to the streams, set "roleARN" to the role to use.

The stream can be configured differently for every step using the producerConfig block:

{
  "eventFrequency": 100,
  ...
  "steps": [
    {
      "config": [
        ...
      ],
      "producerConfig": {
        "stream": "new-stream"
      }
    }
  ]
}
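For example, here is a sketch (the step values are illustrative, and the stream name matches the snippet above) of a workflow whose step sends its events to a stream other than the one named in the Kinesis Producer configuration:

{
  "eventFrequency": 100,
  "repeatWorkflow": true,
  "timeBetweenRepeat": 1000,
  "steps": [
    {
      "config": [{
        "id": "uuid()",
        "timestamp": "nowTimestamp()"
      }],
      "producerConfig": {
        "stream": "new-stream"
      }
    }
  ]
}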

Kafka on kerberized cluster

Valid only for Kafka version 0.10.2.0 or higher.

If you need to send data to Kafka with Kerberos, you can provide the following configuration:

{
    "type": "kafka",
    "topic": "logevent",
    "flatten": false,
    "sync": false,
    "kerberos": {
      "kerberos.conf.file": "<path_your_kerberos_file_krb5.conf>",
      "kafka.brokers.servers": "broker1:port,broker2:port,broker3:port",
      "kafka.jaas.file": "<path_your_jaas_file_kafka_jaas.conf",
      "kafka.security.protocol": "SASL_PLAINTEXT",
      "kafka.service.name": "kafka"
  }
}
  • kerberos.conf.file : Sets the system property 'java.security.krb5.conf'
  • kafka.jaas.file : Sets the system property 'java.security.auth.login.config'
  • kafka.security.protocol : Sets kafka producer property 'security.protocol'
  • kafka.service.name: Sets kafka producer property 'sasl.kerberos.service.name'

In a kerberized Kafka cluster, the keytab file must be defined inside the JAAS file referenced by "kafka.jaas.file": "<path_your_jaas_file_kafka_jaas.conf>".

NATS

A NATS Producer sends json events to the gnatsd broker specified in the config. The following example shows a sample config that sends json events to a locally running NATS broker listening on the default NATS port.

{
    "type": "nats",
    "broker.server": "127.0.0.1",
    "broker.port": 4222,
    "topic": "logevent",
    "flatten": false
}

Note that flatten has the same effect as the option in the Kafka producer.

Tranquility

A Tranquility Producer sends json events using Tranquility to a Druid cluster. Druid is an Open Source Analytics data store and Tranquility is a realtime communication transport that Druid supports for ingesting data. Configure it like so:

{
    "type": "tranquility",
    "zookeeper.host": "localhost",
    "zookeeper.port": 2181,
    "overlord.name":"overlord",
    "firehose.pattern":"druid:firehose:%s",
    "discovery.path":"/druid/discovery",
    "datasource.name":"test",
    "timestamp.name":"startTime",
    "sync": true,
    "geo.dimensions": "where.latitude,where.longitude"
}

When sending data to Druid via Tranquility, we are sending a Task to the Druid Overlord node that will perform realtime ingestion. Our task implementation currently does not allow users to specify the aggregators that are used. We create a single "events" metric that is a Count Aggregation. Everything else is configured through the properties. The sync and flatten properties behave exactly the same as in the Kafka Producer. This works for us currently, but if you need more capability, please file an issue or submit a pull request!

When you start the Generator, it will contact the Druid Overlord and create a task for your datasource and will then begin streaming data into Druid.

MQTT

An MQTT Producer sends json events to the MQTT broker specified in the config. The following example shows a sample config that sends json events to a locally running MQTT broker listening on the default MQTT port. The example also includes the two optional fields, username and password:

{
    "type": "mqtt",
    "broker.server": "tcp://localhost",
    "broker.port": 1883,
    "topic": "/logevent",
    "clientId": "LogEvent",
    "qos": 2,
    "username": "whoami",
    "password": "whatsmypassword"
}

The MQTT producer supports step-specific configuration for QoS and Topic. The entire configuration and each item in it are optional. Add an "mqtt" item to the "producerConfig" map:

"mqtt" : {
    "topic": "/elsewhere",
    "qos": 1
}
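For example, here is a sketch of a complete step (the config values are illustrative) whose events are published to a different topic at QoS 1, while other steps keep the Producer's defaults:

{
    "config": [{
        "timestamp": "nowTimestamp()",
        "reading": "double(0.0, 100.0)"
    }],
    "producerConfig": {
        "mqtt": {
            "topic": "/elsewhere",
            "qos": 1
        }
    }
}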

Azure IoT Hub

An Azure IoT Hub Producer sends json events to the Azure IoT Hub specified in the config. Choose a protocol from HTTPS, AMQPS, or MQTT:

{
    "type": "iothub",
    "connectionString": "<- Get from Azure portal or Device Explorer ->",
    "protocol": "HTTPS",
}

Full Simulation Config Example

Here is a full example of a Simulation Configuration file:

exampleSimConfig.json:
{
    "workflows": [{
            "workflowName": "test",
            "workflowFilename": "exampleWorkflow.json",
            "instances" : 4
        }],
    "producers": [{
            "type": "kafka",
            "broker.server": "192.168.59.103",
            "broker.port": 9092,
            "topic": "logevent",
            "sync": false
    },{
        "type":"logger"
    }]
}

This simulation will run four instances of the Workflow (the same workflow rules, each on its own thread) and send all the events to both the defined Kafka topic and to the Logger Producer, which writes the events to a log file. Use instances to generate multiple streams of traffic from the same workflow.

Workflow Definition

The Workflow is defined in separate files to allow you to have and run multiple Workflows within your Simulation. A Workflow contains the following properties:

Property            | Type    | Description
eventFrequency      | Integer | The time in milliseconds between events output by the steps
varyEventFrequency  | Boolean | If true, a random amount of time (between 0 and half the eventFrequency) will be added to or subtracted from the eventFrequency
repeatWorkflow      | Boolean | If true, the workflow will repeat after it finishes
iterations          | Integer | The number of times that the workflow will repeat. repeatWorkflow must be set to true. Defaults to -1 (no limit)
timeBetweenRepeat   | Integer | The time in milliseconds to wait before the Workflow is restarted
varyRepeatFrequency | Boolean | If true, a random amount of time (between 0 and half the eventFrequency) will be added to or subtracted from the timeBetweenRepeat
stepRunMode         | String  | Possible values: sequential, random, random-pick-one. Default is sequential
steps               | Array   | A list of Workflow Steps to be run in this Workflow

Workflow Steps

Workflow Steps are the meat of your generated json events. They specify what your json will look like and how it will be generated. Depending on the stepRunMode you have chosen, the Generator will execute your steps in different orders (a short configuration sketch follows the list below). The possibilities are as follows:

  • sequential - Steps will be run in the order they are specified in the array.
  • random - Steps will be shuffled and run in a random order. Steps will be reshuffled each time the workflow is repeated.
  • random-pick-one - A random step will be chosen from your config and will be run. No other steps will be run until the workflow repeats. When the workflow repeats, a different random step will be picked.
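For instance, here is a sketch (the values are illustrative) of a workflow that picks a single random step each time it repeats:

{
    "eventFrequency": 1000,
    "repeatWorkflow": true,
    "timeBetweenRepeat": 5000,
    "stepRunMode": "random-pick-one",
    "steps": [{
        "config": [{ "action": "LOGIN" }]
    }, {
        "config": [{ "action": "LOGOUT" }]
    }]
}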

Step

Now that you know how Steps are executed, let's take a look at how they are defined.

Property       | Type             | Description
config         | Array of objects | The json objects to be generated during this step
duration       | Integer          | If 0, this step will run once. If -1, this step will run forever. Any other number is the time in milliseconds to run this step for. Default is 0
iterations     | Integer          | The number of times to repeat the step. If duration is -1, 0, or unset, and iterations is set, only iterations is used. If duration is positive and iterations is set, the step repeats until the duration ends or all iterations have run, whichever happens first
producerConfig | Map of objects   | Optional, producer-specific configuration for this step (see the producer documentation)
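For example, here is a sketch of a step (the values are illustrative) that runs for up to 10 seconds or 100 iterations, whichever comes first:

{
    "config": [{
        "timestamp": "nowTimestamp()"
    }],
    "duration": 10000,
    "iterations": 100
}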

Step Config

The configuration specified in the step config section is the actual json you want output. For example, you could put the following json as the config for a step in your workflow:

{
    "test": "this is a test",
    "test2": "this is another test",
    "still-a-test": true
}

Specifying this as the step config would literally generate that json object every time the step ran. Now, that's interesting, but just echoing json isn't really what we are after. Our Generator supports a number of Functions that can be placed into your json as values. These Functions are run every time the json is generated. Here is an example:

{
    "timestamp": "nowTimestamp()",
    "system": "random('BADGE', 'AUDIT', 'WEB')",
    "actor": "bob",
    "action": "random('ENTER','LOGIN','EXIT', 'LOGOUT')",
    "objects": ["Building 1"],
    "location": {
    	"lat": "double(-90.0, 90.0)",
    	"lon": "double(-180.0, 180.0)"
    },
    "message": "Entered Building 1"
}

This config will generate json documents that look something like:

{"timestamp":1430244584899,"system":"BADGE","actor":"bob","action":"LOGOUT","objects":["Building 1"],"location":{"lat":-19.4224,"lon":-165.0512},"message":"Entered Building 1"}

Each time, the timestamp, system, action, and lat/lon values would be randomly generated.

Full Workflow Definition Config Example

Here is a full example of a Workflow Definition file:

exampleWorkflow.json:
{
    "eventFrequency": 4000,
    "varyEventFrequency": true,
    "repeatWorkflow": true,
    "iterations": 5,
    "timeBetweenRepeat": 15000,
    "varyRepeatFrequency": true,
    "steps": [{
        "config": [{
    		"timestamp": "nowTimestamp()",
		    "system": "random('BADGE', 'AUDIT', 'WEB')",
		    "actor": "bob",
		    "action": "random('ENTER','LOGIN','EXIT', 'LOGOUT')",
		    "objects": ["Building 1"],
		    "location": {
		    	"lat": "double(-90.0, 90.0)",
		    	"lon": "double(-180.0, 180.0)"
		    },
		    "message": "Entered Building 1"
		}],
        "producerConfig": {
        },
        "duration": 0
    },{
        "config": [{
    		"timestamp": "nowTimestamp()",
		    "system": "random('BADGE', 'AUDIT', 'WEB')",
		    "actor": "jeff",
		    "action": "random('ENTER','LOGIN','EXIT', 'LOGOUT')",
		    "objects": ["Building 2"],
		    "location": {
		    	"lat": "double(-90.0, 90.0)",
		    	"lon": "double(-180.0, 180.0)"
		    },
		    "message": "Entered Building 2"
		}],
        "iterations": 2
    }]
}

This workflow would output the defined json about every 4 seconds and then it will wait about 15 seconds before starting again.

Running The Generator

Now that you know how to configure the Generator, it's time to run it. You will need Maven to build the application until we put a release up to download.

First, clone/fork the repo:

git clone git@github.com:acesinc/json-data-generator.git
cd json-data-generator

Now build it!

mvn clean package

Once this completes, a tar file will have been generated for you to use. Unpack the tar file somewhere you want to run the application from:

cp target/json-data-generator-1.0.0-bin.tar your/directory
cd your/directory
tar xvf json-data-generator-1.0.0-bin.tar
cd json-data-generator

In the conf directory, you will find an example Simulation Configuration and an example Workflow Definition. You can change these or make your own configurations to run with, but for now, we will just run the examples. This example simulates an auditing system that generates events when a user performs an action on a system. To do so, do the following:

java -jar json-data-generator-1.0.0.jar exampleSimConfig.json

You will start seeing output in your console and data will begin to be generated. It will look something like this:

...
2015-04-28 14:21:08,013 DEBUG n.a.d.j.g.t.TypeHandlerFactory [Thread-2] Discovered TypeHandler [ integer,net.acesinc.data.json.generator.types.IntegerType ]
2015-04-28 14:21:08,013 DEBUG n.a.d.j.g.t.TypeHandlerFactory [Thread-2] Discovered TypeHandler [ timestamp,net.acesinc.data.json.generator.types.TimestampType ]
2015-04-28 14:30:02,817 INFO data-logger [Thread-2] {"timestamp":1430253002793,"system":"BADGE","actor":"bob","action":"ENTER","objects":["Building 1"],"location":"45.5,44.3","message":"Entered Building 1"}
2015-04-28 14:30:05,369 INFO data-logger [Thread-2] {"timestamp":1430253005368,"system":"AD","actor":"bob","action":"LOGIN","objects":["workstation1"],"location":null,"message":"Logged in to workstation 1"}
2015-04-28 14:30:07,491 INFO data-logger [Thread-2] {"timestamp":1430253007481,"system":"AUDIT","actor":"bob","action":"COPY","objects":["/data/file1.txt","/share/mystuff/file2.txt"],"location":null,"message":"Printed /data/file1.txt"}
2015-04-28 14:30:09,768 INFO data-logger [Thread-2] {"timestamp":1430253009767,"system":"AUDIT","actor":"bob","action":"COPY","objects":["/data/file1.txt","/share/mystuff/file2.txt"],"location":null,"message":"Printed /data/file1.txt"}

This example only outputs to the Logger Producer, so all the data is going to your console and it is also being written to json-data-generator/logs/json-data.log. The data written to json-data.log is a single event per line and does not contain the logging timestamps and other info like above.

If you were to create your own simulation config called mySimConfig.json, you would place that file and any workflow configs into the conf directory and run the application again like so:

java -jar json-data-generator-1.0.0.jar mySimConfig.json

Supported Functions

As you saw in the Workflow Definition, our Generator supports a number of different functions that allow you to generate random data for the values in your json documents. Below is a list of all the currently supported Functions and the arguments they support/require.

Literals

If you specify a literal value as the value of a property, it will just be echoed back into the generated json. For example:

{
	"literal-bool": true
}

Will always generate:

{
    "literal-bool": true
}

String Functions

Function        | Arguments | Description
alpha(#)        | The number of characters to generate | Generates a random string of alphabetic characters with the length specified
alphaNumeric(#) | The number of characters to generate | Generates a random string of alphanumeric characters with the length specified
firstName()     | n/a | Generates a random first name from a predefined list of names
lastName()      | n/a | Generates a random last name from a predefined list of names
uuid()          | n/a | Generates a random UUID
stringMerge()   | A delimiter followed by the string values to merge | Merges the input arguments together using the delimiter. This can be used with the this.prop or cur.prop keywords to merge generated values, e.g. stringMerge(_, this.firstName, this.lastName) would output something like Bilbo_Baggins
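For example, here is a small sketch (the property names are illustrative) that uses stringMerge with previously generated values:

{
    "firstName": "firstName()",
    "lastName": "lastName()",
    "username": "stringMerge(_, this.firstName, this.lastName)"
}

This would produce a username like Bilbo_Baggins.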

Primitive Functions

Function            | Arguments | Description
boolean()           | n/a | Random true/false
double([min, max])  | Optional min value or range | If no args, generates a random double between Double.MIN and MAX. If one arg, generates a double between that min and Double.MAX. If two args, generates a double between those two numbers
integer([min, max]) | Optional min value or range | If no args, generates a random integer between Integer.MIN and MAX. If one arg, generates an integer between that min and Integer.MAX. If two args, generates an integer between those two numbers
long([min, max])    | Optional min value or range | If no args, generates a random long between Long.MIN and MAX. If one arg, generates a long between that min and Long.MAX. If two args, generates a long between those two numbers

Date Functions

Function | Arguments | Description
date([yyyy/MM/ddTHH:mm:ss, yyyy/MM/ddTHH:mm:ss]) | Optional min date or date range | Generates a random date between the min and max specified. If no args, generates a random date before today. If one arg, generates a date after the specified min. If two args, generates a date between those dates
now([#_unit]) | Optional amount and unit to add to the now date | If no args, generates the date of now. If one arg, the # portion is the amount of time and the unit portion (y=year, d=day, h=hour, m=minute) is the unit to add. To subtract time, make the number negative (e.g. -5_d). Generates the date as an ISO 8601 formatted string
timestamp([yyyy/MM/ddTHH:mm:ss, yyyy/MM/ddTHH:mm:ss]) | Optional min date or date range | Generates a timestamp (i.e. a long value) in the specified range. The same rules as the date() Function apply
nowTimestamp() | n/a | Generates a timestamp (i.e. a long value) of the now date
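For example, here is a sketch (the property names are illustrative) combining the date Functions in a step config:

{
    "created": "now()",
    "created-ts": "nowTimestamp()",
    "expires": "now(2_h)",
    "lastLogin": "date('2015/03/01T00:00:00', '2015/03/30T00:00:00')"
}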

Special Functions

Function | Arguments | Description
random(val1, val2, ...) | Literal values to choose from; can be Strings, Integers, Longs, Doubles, or Booleans | Randomly chooses one of the specified values
counter(name) | The name of the counter to generate | Generates a one-up number for a specific name. Specify different names for different counters
this.propName | propName = the name of another property | Allows you to reference values that have already been generated (i.e. they must come before this property). For example, this.test.nested-test references the value of test.nested-test in the json object generated so far. You can also use a this. clause when calling other functions; e.g. date(this.otherDate) generates a date after another generated date
cur.propName | propName = the name of another property at the same level as this property | Allows you to reference values at the same level as the property being generated. This is useful when you want to reference properties within a generated array and you don't know the index of the array
randomIncrementLong(name, baseValue, minStep, maxStep) | The name of the value to generate, its base value, and the boundaries for the step | Generates a randomly incrementing number for a specific name. Specify different names for different counters
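For example, here is a sketch (the property names are illustrative) showing cur. referencing a sibling value inside a repeated array (using the repeat() array Function described in the next section), where the element's index isn't known in advance:

{
    "sensors": [
        "repeat(3)",
        {
            "id": "uuid()",
            "idCopy": "cur.id"
        }
    ]
}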

Arrays

We have two super special Functions for use with arrays: repeat() and random(). The repeat() function tells the Generator to take the elements in the array and repeat their generation a certain number of times. You can specify the number of times, or, if you provide no arguments, it will repeat 0-10 times. Use it like so:

{
    "values": [
        "repeat(7)",
        {
        	"date": "date('2015/04/01T00:00:00', '2015/04/25T00:00:00')",
        	"count": "integer(1, 10)"
        }]
}

This will generate a json object that has 7 values each with different random values for date and count.

The random() function tells the generator to pick a random element from the array and output only that element. Use it like so:

{
    "values": [
        "random()",
        {
        	"date": "date('2015/04/01T00:00:00', '2015/04/25T00:00:00')",
        	"count": "integer(1, 10)"
        },{
        	"thing1": "random('red', 'blue')"
        }]
}

This will generate an array with one element that is either the element with date & count or the element with thing1 in it.

Custom Functions

To create a custom function, you must extend the TypeHandler as follows:

public class PhoneNumberType extends TypeHandler {

    public static final String TYPE_NAME = "phone";

    private String areaCode;

    @Override
    public void setLaunchArguments(String[] launchArguments) {
        super.setLaunchArguments(launchArguments);
        if (launchArguments.length >= 1) {
            areaCode = launchArguments[0];
        }
    }

    @Override
    public String getNextRandomValue() {
        return (areaCode != null ? areaCode : getRand().nextInt(111, 999)) + "-"
                + getRand().nextInt(111, 999) + "-"
                + getRand().nextInt(1111, 9999);
    }

    @Override
    public String getName() {
        return TYPE_NAME;
    }

}
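As a hedged sketch of how such a handler could be used (assuming the class above lives in the hypothetical package com.example.types, and that launch arguments are passed in parentheses the same way as for the built-in Functions), you would register the package in your workflow entry of the Simulation Configuration:

"workflows": [{
        "workflowName": "test",
        "workflowFilename": "exampleWorkflow.json",
        "customTypeHandlers": ["com.example.types"]
    }]

and then reference the function by the name returned from getName() in a step config:

{
    "contact-number": "phone(202)"
}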

Keep in mind:

  • You must make sure to add your class's package name to the workflow config as shown above.
  • getName() returns the name of the function. If you use an already existing function's name, it will be overwritten.
  • Calling getRand() requires that Apache's commons-math3 library is on the compile classpath.
  • getNextRandomValue() can return other primitive values, not just String.

Examples

Here is a Kitchen Sink example to show you all the different ways you can generate data:

{
    "strings": {
        "firstName": "firstName()",
        "lastName": "lastName()",
        "random": "random('one', \"two\", 'three')",
        "alpha": "alpha(5)",
        "alphaNumeric": "alphaNumeric(10)",
        "uuid": "uuid()"
    },
    "primatives": {
        "active": "boolean()",
        "rand-long": "long()",
        "rand-long-min": "long(895043890865)",
        "rand-long-range": "long(787658, 8948555)",
        "rand-int": "integer()",
        "rand-int-min": "integer(80000)",
        "rand-int-range": "integer(10, 20)",
        "rand-double": "double()",
        "rand-double-min": "double(80000.44)",
        "rand-double-range": "double(10.5, 20.3)"
    },
    "dates": {
        "rand-date": "date()",
        "min-date": "date(\"2015/03/01T00:00:00\")",
        "range-date": "date(\"2015/03/01T00:00:00\",\"2015/03/30T00:00:00\")",
        "now": "now()",
        "nowTimestamp": "nowTimestamp()",
        "5days-ago": "now(-5_d)",
        "timestamp": "timestamp(\"2015/03/01T00:00:00\",\"2015/03/30T00:00:00\")"
    },
    "literals": {
        "literal-bool": true,
        "literal-int": 34,
        "literal-long": 897789789574389,
        "literal-double": 45.33
    },
    "references": {
        "after-5days-ago": "date(this.dates.5days-ago)",
        "isItActive": "this.primatives.active"
    },
    "counters": {
        "counter1": "counter('one')",
        "counter2": "counter('two')",
        "counter2-inc": "counter('two')"
    },
    "randomIncrementLongs": {
        "randomIncrementLong1": "randomIncrementLong('one', 1, 1, 10)"
        "randomIncrementLong2": "randomIncrementLong('two', 0, 0, 5000)"
    },
    "array": [
        {
            "thing": "alpha(3)"
        },
        {
            "thing2": "alpha(3)"
        },
        {
            "thing3": true
        },
        {
            "thing4": "job"
        }
    ],
    "repeat-array": [
        "repeat(3)",
        {
            "thing1": "random('red','blue')",
            "thing2": "random('red', 'blue')"
        }
    ]
}

Which generates the following json:

{
    "strings": {
        "firstName": "Bob",
        "lastName": "Black",
        "random": "two",
        "alpha": "ihDra",
        "alphaNumeric": "F4mmpuSRDI",
        "uuid": "1f04040f-fc6c-4de0-a17f-c588d1b62e75"
    },
    "primatives": {
        "active": false,
        "rand-long": 8886422858603719071,
        "rand-long-min": 4635399365989710039,
        "rand-long-range": 6934933,
        "rand-int": 433941228,
        "rand-int-min": 519082863,
        "rand-int-range": 17,
        "rand-double": 7.339790593356424E307,
        "rand-double-min": 1.7261947151579352E308,
        "rand-double-range": 11.0806
    },
    "dates": {
        "rand-date": "1996-07-18T13:50Z",
        "min-date": "2015-03-09T13:50Z",
        "range-date": "2015-03-07T13:50Z",
        "now": "2015-04-28T13:50Z",
        "nowTimestamp": 1430250600027,
        "5days-ago": "2015-04-23T13:50Z",
        "timestamp": 1425588600027
    },
    "literals": {
        "literal-bool": true,
        "literal-int": 34,
        "literal-long": 897789789574389,
        "literal-double": 45.33
    },
    "references": {
        "after-5days-ago": "2015-04-24T13:50Z",
        "isItActive": false
    },
    "counters": {
        "counter1": 0,
        "counter2": 0,
        "counter2-inc": 1
    },
    "randomIncrementLongs": {
        "randomIncrementLong1": "1"
        "randomIncrementLong2": "0"
    },
    "array": [{
            "thing": "Psu"
        }, {
            "thing2": "yoU"
        }, {
            "thing3": true
        }, {
            "thing4": "job"
        }],
    "repeat-array": [{
            "thing1": "red",
            "thing2": "blue"
        }, {
            "thing1": "blue",
            "thing2": "blue"
        }, {
            "thing1": "blue",
            "thing2": "red"
        }]
}

json-data-generator's Issues

Adding custom function

Discussed in #89

Originally posted by GMolozis April 25, 2024
I am trying to add a custom function that generates country data in a similar way to how firstName is generated. I created the class in the same directory where the other types are defined, but according to the README I have to register the package name in the Workflow config. So I added the package "net.acesinc.data.json.generator.config" into the WorkflowConfig.java file, but the function does not work. Any suggestions?

Allow a DataSource to be used to provide values in generated items

Sometimes, we have a need to generate values that are based on some other source of data. For example, we might have an existing source of data that provides information about airports around the world. We might want to use the actual names, locations, or callsigns of those airports when generating events. So we would like to have a new Data Source that you could provide to the generator that could be pulled from when each event is generated. Then you should be able to reference the values within the items pulled from the data source in your generated events.

Exception during execution

Hello!
Congratulations on your code, it is helping me a lot!

I'm trying to connect the kafka and the data-stream, but it has not worked.
The zookeeper and Kafka are ok, I have a topic, consumer and producer.
Always get this error displayed below.
Can you help me please?

root@vinicius-vm-3:/home/vinicius/json-data-generator/files# java -jar json-data-generator-1.2.2-SNAPSHOT.jar producer.json
2016-10-06 15:27:01,636 INFO n.a.d.j.g.JsonDataGenerator [main] Overriding Simulation Config file from command line to use [ producer.json ]
2016-10-06 15:27:01,643 DEBUG n.a.d.j.g.JsonDataGenerator [main] Creating Simulation Runner using Simulation Config [ producer.json ]
2016-10-06 15:27:01,965 INFO n.a.d.j.g.JsonDataGenerator [main] Adding Kafka Producer with properties: {type=kafka, broker.server=127.0.0.1, broker.port=9092, topic=topico-json-data, sync=false}
SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
Exception in thread "main" java.lang.NullPointerException
    at net.acesinc.data.json.generator.log.KafkaLogger.<init>(KafkaLogger.java:49)
    at net.acesinc.data.json.generator.JsonDataGenerator.<init>(JsonDataGenerator.java:51)
    at net.acesinc.data.json.generator.JsonDataGenerator.main(JsonDataGenerator.java:107)
root@vinicius-vm-3:/home/vinicius/json-data-generator/files#

Thanks
Sorry my english

Add a FileLogger

The json-data-generator should be able to write json documents to separate files. This would generate 1 json event per file.

Error in providing sim config file

If I provide the sim config as specified in the documentation, I get the error below.

$ java -jar json-data-generator-1.2.2-SNAPSHOT.jar conf\exampleSimConfig.json
2017-03-12 14:38:38,462 INFO n.a.d.j.g.JsonDataGenerator [main] Overriding Simulation Config file from command line to use [ conf\exampleSimConfig.json ]
2017-03-12 14:38:38,466 DEBUG n.a.d.j.g.JsonDataGenerator [main] Creating Simulation Runner using Simulation Config [ conf\exampleSimConfig.json ]
2017-03-12 14:38:38,681 ERROR n.a.d.j.g.JsonDataGenerator [main] Error getting Simulation Config [ conf\exampleSimConfig.json ]
com.fasterxml.jackson.databind.JsonMappingException: No content to map due to end-of-input at [Source: UNKNOWN; line: 1, column: 1]
    at com.fasterxml.jackson.databind.JsonMappingException.from(JsonMappingException.java:148) ~[jackson-databind-2.5.2.jar:2.5.2]
    at com.fasterxml.jackson.databind.ObjectMapper._initForReading(ObjectMapper.java:3607) ~[jackson-databind-2.5.2.jar:2.5.2]
    at com.fasterxml.jackson.databind.ObjectMapper._readMapAndClose(ObjectMapper.java:3547) ~[jackson-databind-2.5.2.jar:2.5.2]
    at com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2648) ~[jackson-databind-2.5.2.jar:2.5.2]
    at net.acesinc.data.json.generator.config.JSONConfigReader.readConfig(JSONConfigReader.java:42) ~[json-data-generator-1.2.2-SNAPSHOT.jar:1.2.2-SNAPSHOT]
    at net.acesinc.data.json.generator.JsonDataGenerator.getSimConfig(JsonDataGenerator.java:103) ~[json-data-generator-1.2.2-SNAPSHOT.jar:1.2.2-SNAPSHOT]
    at net.acesinc.data.json.generator.JsonDataGenerator.<init>(JsonDataGenerator.java:35) [json-data-generator-1.2.2-SNAPSHOT.jar:1.2.2-SNAPSHOT]
    at net.acesinc.data.json.generator.JsonDataGenerator.main(JsonDataGenerator.java:117) [json-data-generator-1.2.2-SNAPSHOT.jar:1.2.2-SNAPSHOT]
Exception in thread "main" java.lang.NullPointerException
    at net.acesinc.data.json.generator.JsonDataGenerator.startRunning(JsonDataGenerator.java:95)
    at net.acesinc.data.json.generator.JsonDataGenerator.main(JsonDataGenerator.java:132)
2017-03-12 14:38:38,693 INFO n.a.d.j.g.JsonDataGenerator [Thread-1] Shutdown Hook Invoked. Shutting Down Loggers
Exception in thread "Thread-1" java.lang.NullPointerException
    at net.acesinc.data.json.generator.JsonDataGenerator.stopRunning(JsonDataGenerator.java:99)
    at net.acesinc.data.json.generator.JsonDataGenerator$1.run(JsonDataGenerator.java:123)

When I looked into the source code, it looks like the sim config is read as a stream, so I used the command below and I was able to see the generator working.

java -jar json-data-generator-1.2.2-SNAPSHOT.jar | cat conf\exampleSimConfig.json

I am using Windows to run this generator. Let me know if this is a valid issue or not.

Writing multiple iterations to a single file

First off, thanks so much for creating the json-data-generator. It's exactly what I needed at this point in my project, and it will be invaluable going forward. Thanks for your hard work!

Can you recommend the best way (i.e. the correct place in the code) to modify the workflows so as to write all events/iterations to a single file, as opposed to a single file for each workflow iteration? I'm not fluent in Java, so I'm not certain where/how to perform this change.

Thanks.

Error in double() function

Hi!

Running your jackieChan example, sending it to Kafka, it returns the following exception:

2016-12-06 18:42:33,202 WARN n.a.d.j.g.RandomJsonGenerator [Thread-2] Error creating type [ double(1.0,10.0) ]. Prop [ strength ] being ignored in output. Reason: For input string: "4,3396"
2016-12-06 18:42:33,202 DEBUG n.a.d.j.g.RandomJsonGenerator [Thread-2] Error creating type [ double(1.0,10.0) ]. Prop [ strength ] being ignored in output.
java.lang.NumberFormatException: For input string: "4,3396"
    at sun.misc.FloatingDecimal.readJavaFormatString(FloatingDecimal.java:2043) ~[?:1.8.0_111]
    at sun.misc.FloatingDecimal.parseDouble(FloatingDecimal.java:110) ~[?:1.8.0_111]
    at java.lang.Double.parseDouble(Double.java:538) ~[?:1.8.0_111]

and the strength value is ignored in the output:

2016-12-06 18:42:33,219 DEBUG n.a.d.j.g.l.KafkaLogger [Thread-2] Sending event to Kafka: [ {"timestamp":"2016-12-06T18:42:33.163Z","style":"KUNG_FU","action":"JUMP","weapon":"ROPE","target":"HEAD"} ]

Thanks!

data generator to kafka

Hi everyone,

I would like to create random GPS positions that I plan to send to a Kafka topic in the end. In a first step I only used the "logger" producer, which worked perfectly, but I am failing to fill the Kafka topic with the same data. I have created a new Kafka topic ("RandomGPS") for this aim, and it remains empty.

Here are the producers settings

"producers": [{
"type": "kafka",
"broker.server": "172.17.0.3",
"broker.port": 9092,
"topic": "RandomGPS",
"flatten": true,
"sync": true
},{
"type":"logger"
}]

Is there something I am missing ? Thanks for your help.

Troubles running default config

  1. Clone repository: git clone https://github.com/everwatchsolutions/json-data-generator.git (1.4.1)
  2. Compile using mvn clean package from command line (success), maven 3.3.9, Oracle Java 1.8.0_152, Linux Mint 18.2
  3. Copy bin.tar and .jar files to separate directory
  4. Modify classes/defaultSimConfig.json by removing nats producer section (only default logger is left)
  5. Copy classes/defaultSimConfig.json and classes/normalUser1Workflow.json to above directory
  6. Enter above directory and start program using command java -jar json-data-generator-1.4.1-SNAPSHOT.jar defaultSimConfig.json

Expected:
Demo project is run

Actual:
achamier@deathstar ~/git/json-data-generator/run $ java -jar json-data-generator-1.4.1-SNAPSHOT.jar defaultSimConfig.json
Error: A JNI error has occurred, please check your installation and try again
Exception in thread "main" java.lang.NoClassDefFoundError: org/eclipse/paho/client/mqttv3/MqttException
at java.lang.Class.getDeclaredMethods0(Native Method)
at java.lang.Class.privateGetDeclaredMethods(Class.java:2701)
at java.lang.Class.privateGetMethodRecursive(Class.java:3048)
at java.lang.Class.getMethod0(Class.java:3018)
at java.lang.Class.getMethod(Class.java:1784)
at sun.launcher.LauncherHelper.validateMainClass(LauncherHelper.java:544)
at sun.launcher.LauncherHelper.checkAndLoadMain(LauncherHelper.java:526)
Caused by: java.lang.ClassNotFoundException: org.eclipse.paho.client.mqttv3.MqttException
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:338)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 7 more

When generator is run from target directory it runs without an issue.
I googled for a solution; there are several threads which end in "oh, it looks like you forgot about dependencies" but without a real solution, and as far as I can see mqttv3 is declared both in pom.xml and json-data-generator.iml. Any help?

Maintain state across events

There have been multiple requests for the ability to reference values of previous events in new events; however, we currently only maintain state within a single event. I'm creating this ticket to track interest in this feature. Unfortunately this isn't an easy feature to implement, as we would have to track each event generated, which would mean we would need a cache of some kind. There would also need to be a way to reference other events, and I'm not sure how people would know which event they want to pull values from.

If you have any comments about this feature, please add them below.

support: question

Hi, I have a workflow containing one step with three config elements. It should generate events which are linked via an id. Is there a way to use a value from message 1 of the config in the same config for message 2?

regards

Error while using the repeat() function in my Workflow definition

2018-08-01 02:11:07,951 ERROR n.a.d.j.g.SimulationRunner [main] Error reading config: test
com.fasterxml.jackson.databind.JsonMappingException: Can not instantiate value of type [map type; class java.util.LinkedHashMap, [simple type, class java.lang.String] -> [simple type, class java.lang.Object]] from String value ('repeat(3)'); no single-String constructor/factory method
at [Source: java.io.BufferedInputStream@5b8dfcc1; line: 9, column: 23] (through reference chain: net.acesinc.data.json.generator.workflow.Workflow["steps"]->java.util.ArrayList[0]->net.acesinc.data.json.generator.workflow.WorkflowStep["config"]->java.util.ArrayList[0])
at com.fasterxml.jackson.databind.JsonMappingException.from(JsonMappingException.java:148) ~[jackson-databind-2.5.2.jar:2.5.2]
at com.fasterxml.jackson.databind.DeserializationContext.mappingException(DeserializationContext.java:843) ~[jackson-datab

myWorkflow.json:


{
    "eventFrequency": 4000,
    "varyEventFrequency": true,
    "repeatWorkflow": true,
    "timeBetweenRepeat": 60000,
    "varyRepeatFrequency": true,
    "steps": [{
        "config": ["repeat(3)", {
            "timestamp": "nowTimestamp()",
            "system": "BADGE",
            "actor": "bob",
            "action": "ENTER",
            "objects": ["Building 1"],
            "location": "45.5,44.3",
            "message": "Entered Building 1"
        }],
        "duration": 0
    }]
}

RandomType is thread unsafe

Hi there!
Thanks a lot for the great tool!

I've changed exampleSimConfig.json to check JSON generating with multiple threads:

{
    "workflows": [{
            "workflowName": "test",
            "workflowFilename": "exampleWorkflow.json"
        }, {
            "workflowName": "test",
            "workflowFilename": "exampleWorkflow.json"
        },
        {
            "workflowName": "test",
            "workflowFilename": "exampleWorkflow.json"
        }
    ],
    "producers": [{
            "type": "logger"
        },
        {
            "type": "file",
            "output.directory": "test_data/test",
            "file.prefix": "test_",
            "file.extension": ".json"
        }
    ]
}

I've got many errors during simulation:

DEBUG n.a.d.j.g.RandomJsonGenerator [Thread-2] Error creating type [ random("PRINT","OPEN","COPY") ]. Prop [ action ] being ignored in output.
org.apache.commons.math3.exception.NumberIsTooLargeException: lower bound (0) must be strictly less than upper bound (-1)
	at org.apache.commons.math3.distribution.UniformIntegerDistribution.<init>(UniformIntegerDistribution.java:78) ~[commons-math3-3.5.jar:3.5]
	at org.apache.commons.math3.random.RandomDataGenerator.nextInt(RandomDataGenerator.java:198) ~[commons-math3-3.5.jar:3.5]
	at net.acesinc.data.json.generator.types.RandomType.getNextRandomValue(RandomType.java:48) ~[json-data-generator-1.2.2-SNAPSHOT.jar:1.2.2-SNAPSHOT]
	at net.acesinc.data.json.generator.RandomJsonGenerator.processProperties(RandomJsonGenerator.java:102) [json-data-generator-1.2.2-SNAPSHOT.jar:1.2.2-SNAPSHOT]
	at net.acesinc.data.json.generator.RandomJsonGenerator.processProperties(RandomJsonGenerator.java:133) [json-data-generator-1.2.2-SNAPSHOT.jar:1.2.2-SNAPSHOT]
	at net.acesinc.data.json.generator.RandomJsonGenerator.generateJson(RandomJsonGenerator.java:50) [json-data-generator-1.2.2-SNAPSHOT.jar:1.2.2-SNAPSHOT]
	at net.acesinc.data.json.generator.EventGenerator.generateEvent(EventGenerator.java:243) [json-data-generator-1.2.2-SNAPSHOT.jar:1.2.2-SNAPSHOT]
	at net.acesinc.data.json.generator.EventGenerator.executeStep(EventGenerator.java:182) [json-data-generator-1.2.2-SNAPSHOT.jar:1.2.2-SNAPSHOT]
	at net.acesinc.data.json.generator.EventGenerator.runSequential(EventGenerator.java:71) [json-data-generator-1.2.2-SNAPSHOT.jar:1.2.2-SNAPSHOT]
	at net.acesinc.data.json.generator.EventGenerator.runWorkflow(EventGenerator.java:53) [json-data-generator-1.2.2-SNAPSHOT.jar:1.2.2-SNAPSHOT]
	at net.acesinc.data.json.generator.EventGenerator.run(EventGenerator.java:257) [json-data-generator-1.2.2-SNAPSHOT.jar:1.2.2-SNAPSHOT]
	at java.lang.Thread.run(Thread.java:745) [?:1.8.0_112]

I would appreciate any suggestion to solve this issue.

Is there a way to generate IP address?

Thanks for the great tool. I'm trying to generate an IP address, but the following don't work:

stringMerge(.,integer(100, 180),integer(50, 80),integer(100, 160),integer(180, 240))
integer(100, 180).integer(50, 80).integer(100, 160).integer(180, 240)

Is this possible without some big hacks?

Can't generate array of double

Hi, I need to generate json messages like the ones produced by CollectD (a metrics collector among many, but that is the one I have to use).
Messages are like these two:

  • [{"values":[0,0],"dstypes":["derive","derive"],"dsnames":["user","syst"],"time":1519314305.896,"interval":10.000,"host":"host1","plugin":"processes","plugin_instance":"pengine","type":"ps_cputime","type_instance":""}]
  • [{"values":[696],"dstypes":["gauge"],"dsnames":["value"],"time":1519314305.896,"interval":10.000,"host":"host2","plugin":"processes","plugin_instance":"stonithd","type":"ps_stacksize","type_instance":""}]

After a lot of configuration attempts I wasn't able to produce a message with {"values":[0,0],...
It seems that arrays of primitive values are not supported.
Do you have any suggestions?
Thanks
R.

Main thread never terminates

Hi, I have just recently started using the json-data-generator, and it is a useful tool.
I noticed that the main thread never terminates, even after the simulation is finished. I might be missing something. I am wondering if this is on purpose or is it a bug?

From looking at the code, it seems that the main thread of the application is left waiting at line 162 of JsonDataGenerator forever (master branch), because the status of the "running" boolean within SimulationRunner, which is accessed using isRunning() is never updated once the simulation is finished.

The following is the while loop which seems to always last forever:

        while (gen.isRunning()) {
            try {
                Thread.sleep(1000);
            } catch (InterruptedException ex) {
                //wakie wakie!
            }
        }

There are "running" booleans in three classes: JsonDataGenerator, SimulationRunner and EventGenerator. It looks like the problem is in the SimulationRunner. It creates a group of threads, and starts them running. But when you call the isRunning() method on it, it should be checking the running status of all the threads that it started. If it did that, its isRunning() method would return false when all the threads had ended, and so the JsonDataGenerator would also know when the simulation had ended, and the main thread would not wait at line 162 forever.

Within EventGenerator, the running boolean in there is set to false when a particular workflow is finished. But at the level of SimulationRunner, it seems to never update its boolean.

The consequence of this is that when a user runs the jar from the command line they have to do a CTRL+C to kill it.

Please let me know if I am missing something. Thanks

Not able to generate JSON output from the schema

Hi,

I am trying to use the framework to generate json output using the sample schema below.

{
    "$schema": "http://json-schema.org/draft-04/schema",
    "title": "Basic Info",
    "type": "object",
    "properties": {
        "firstName": {
            "type": "string"
        },
        "age": {
            "type": "number"
        }
    }
}

I am not able to convert or parse the above schema into json output. The code doesn't recognize it as a schema; rather, it just considers it another json document and parses it as such. Meaning, it converts the above schema as-is into json output. I can see from the JsonDataGenerator classes that we feed in sample json files like config-array-test.json and convert them. However, the schema I am trying to parse or convert doesn't work. Requesting your inputs on the same.

Below is the output:

{"$schema":"http://json-schema.org/draft-04/schema",
"title":"BasicInfo",
"type":"object",
"properties.firstName.type":"string",
"properties.age.type":"number"}

HTTP-POST error 404

Hi there! Thanks for this excellent work. I am very new to this kind of json response. I was using http-post to generate data, but I got a 404 error.
floodConfig.json
{ "workflows": [{ "workflowName": "flood", "workflowFilename": "floodWorkflow.json" }], "producers": [{ "type": "http-post", "url": "http://localhost:80/ingest" }] }
floodWorkflow.json
{ "eventFrequency": 2000, "varyEventFrequency": true, "repeatWorkflow": true, "timeBetweenRepeat": 2000, "varyRepeatFrequency": true, "steps": [{ "config": [{ "id": 1, "w_lvl": "double(10.0,100.0)" }, { "id": 2, "w_lvl": "double(10.0,100.0)" }, { "id": 3, "w_lvl": "double(10.0,100.0)" }, { "id": 4, "w_lvl": "double(10.0,100.0)" }] }] }

After I run this configuration, I copy and paste the URL, which is http://localhost:80/ingest, and get a 404 error. Please teach me, senpai.

Release new version

Hi,

The last released version is from May 25, 2016.
Since then, a lot of new features have been added (appreciate that).
Do you plan to make periodic releases again? They are quite nice to use in, e.g., test automation, by just downloading a release instead of cloning and building.

BUG: closing parenthesis ")" breaks parsing arguments

Steps:

  1. Create field random with subfields that contain closing parenthesis, e.g.
    "function_name": "random('function1()', 'function2()')"

Expected:
The field should have been filled with one of the provided values, e.g. function1() or function2()

Actual:
Parsing of the arguments breaks after the first closing parenthesis, disregarding any quoting, e.g. 'function1(

Generated 0.0 events/s

Hi

I'm running the generator, with this configuration:

"eventFrequency": 1, "steps": [{ "config":[{ "user_id": "uuid()", "page_id": "uuid()", "ad_id": "random('cmp1', 'cmp2', 'cmp3', 'cmp4', 'cmp5', 'cmp6', 'cmp7', 'cmp8')", "ad_type": "random('banner', 'modal', 'sponsored-search', 'mail', 'mobile')", "event_type": "random('view', 'click', 'purchase')", "event_time": "now()", "ip_address": "127.0.0.1" }], "duration": 100000 }] }

The events are sent to a Kafka topic.

At the end of the execution, it says that it generated 0.0 events/s. Is there a way to fix that?

Generate JSON web server logs

Hi,

I have successfully used your tool to generate fake log entries that simulate logs generated by a web server. This includes header/response data, URL, duration, exceptions, etc.

I would like to share my workflow/config because I believe that others might be interested to use it or enhance it.
I have submitted the files in pull request #38

Regards,
Jovan

http-post not working how to debug

I am trying to produce messages and publish them to a secure endpoint. I started it with SSL parameters, but somehow I don't see any data being published. During startup it gave the following message:

Adding HTTP Post Logger with properties: {type=http-post, url=https://server-msrt-ft2.xxx.com:3905/events/com.message-router-EVENT-INPUT}

After this message I don't see anything in the log on the command line about what's happening. I only know that my consumer is not seeing any messages. I added an extra producer to see what it is producing and found it is creating json files, but I am not sure why http-post is not working. Can someone suggest how to debug this?
I have attached log file for reference. Following is my Simconfig file:

{
    "workflows": [{
        "workflowName": "newsyslog",
        "workflowFilename": "ecnewsyslogWorkflow.json"
    }, {
        "workflowName": "newsnmp",
        "workflowFilename": "ecnewsnmpWorkflow.json"
    }],
    "producers": [{
        "type": "http-post",
        "url": "https://server-msrt-ft2.xxx.com:3905/events/com.message-router-EVENT-INPUT"
    }]
}

Hints with simulation configuration

Hi Folks,

I'd like to use json-data-generator to generate events that mimics users interacting with a content application. Due to the nature of the application however there are a few requirements to be met:

  1. Content has an ordered structure of 1 cover x N items that imposes a navigation condition (i.e., in order to see the 2nd item of a given content you must see the cover and the 1st item).
  2. I need to simulate multiple users navigating at the same time (or within a given interval).

I noticed on the readme that there is some kind of cursor or references, but it was not clear how can I use them.

Can you give some directions on how to build those events?

Thanks in advance
