eKuiper FVT (functional verification tests) covers the following scenarios.

The scenarios are invoked automatically in GitHub Actions on any new code commit or pull request. Another Raspberry Pi continuous integration environment will also be available for running the test cases in an ARM environment. If any FVT run fails, please re-check the code or update the scripts if necessary.
The eKuiper project uses JMeter to write the test scripts for the following reasons:

- BeanShell Assertion, which can be used to extract and process complex message contents.

## Prepare JMeter
eKuiper uses JMeter for FVT test scenarios, including REST API, CLI and end-to-end test scenarios.
### Install MQTT broker
Because the test scripts use an MQTT broker as the source and sink of eKuiper rules, an MQTT broker is required for running the scripts. If you use a broker that cannot be accessed at `tcp://127.0.0.1:1883`, you should modify the scripts and specify your MQTT broker address, and update the broker address in `etc/mqtt_source.yaml` as well.
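A quick way to confirm that a broker is actually reachable on the default address is to round-trip a message with any MQTT client; the sketch below assumes the Mosquitto command line clients are installed:

```shell
# Subscribe to a scratch topic in the background (assumes mosquitto clients are installed)
mosquitto_sub -h 127.0.0.1 -p 1883 -t fvt/check &

# Publish a message; the subscriber should print "hello" if the broker is reachable
mosquitto_pub -h 127.0.0.1 -p 1883 -t fvt/check -m "hello"
```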
Modify the script file that you want to run:

- `mqtt_srv`: The default value is `127.0.0.1`; update it if you use a different broker. See `Test Plan > User Defined Variables > mqtt_srv`.
- `srv`: The eKuiper server address, `127.0.0.1` by default. See `Test Plan > User Defined Variables > srv`.
- `rest_port`: The eKuiper server REST API port, `9081` by default. Please change it if eKuiper runs on a different port. See `Test Plan > User Defined Variables > rest_port`.

## Run JMeter
For most of the scripts, you can start JMeter in the default way, such as `bin/jmeter.sh` on Mac or Linux. But some scripts need parameters to be passed when running them; please refer to the sections below for details. Make sure the MQTT broker and eKuiper are started before running the tests.
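For CI-style runs it is usually easier to launch JMeter in non-GUI mode and pass the properties on the command line. A minimal sketch; the script path and property values are placeholders, not the exact ones used by the pipeline:

```shell
# -n: non-GUI mode, -t: test plan, -D: properties read by the script, -l: result log
bin/jmeter.sh -n -t fvt_scripts/select_all_rule.jmx \
  -Dbase="/opt/kuiper" \
  -Dfvt="/path/to/ekuiper/source" \
  -l result.jtl
```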
The script tests the basic steps for stream operations, including both API and CLI. The script needs to be told the location of the eKuiper installation directory, so that it knows where to invoke the eKuiper CLI. Specify the `base` property in the JMeter command line; `base` is where eKuiper is installed. Below is the command for starting JMeter.

`bin/jmeter.sh -Dbase="/opt/kuiper"`
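For reference, the CLI calls the script drives look roughly like the ones below. This is a sketch only; it assumes the default CLI binary `bin/kuiper` under the `base` directory and an illustrative stream named `demo`:

```shell
BASE="/opt/kuiper"

# Create a stream, list the existing streams, then drop the stream again via the eKuiper CLI
$BASE/bin/kuiper create stream demo '(temperature float, humidity bigint) WITH (FORMAT="JSON", DATASOURCE="demo")'
$BASE/bin/kuiper show streams
$BASE/bin/kuiper drop stream demo
```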
The script tests stream and rule operations. The script needs to be told the location of the eKuiper installation directory, so that it knows where to invoke the eKuiper CLI.

- Specify the `base` property in the JMeter command line; `base` is where eKuiper is installed.
- Specify the `fvt` property in the JMeter command line; `fvt` is where you develop eKuiper, and the script reads the rule file `test/rule1.txt` from that location.
- Modify `mqtt.server` to your MQTT broker address in the file `test/rule1.txt`.
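For reference, an eKuiper rule definition with an MQTT sink has roughly the following shape. This is only an illustration of the format, not the actual content of `test/rule1.txt`; the SQL, topic and rule id are assumptions:

```shell
# Write an illustrative rule definition with an MQTT sink to a scratch file
cat > /tmp/rule_example.json <<'EOF'
{
  "id": "rule_example",
  "sql": "SELECT * FROM demo",
  "actions": [
    {
      "mqtt": {
        "server": "tcp://127.0.0.1:1883",
        "topic": "devices/result"
      }
    }
  ]
}
EOF
```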
Below is the command for starting JMeter.

`bin/jmeter.sh -Dbase="/opt/kuiper" -Dfvt="/Users/rockyjin/Downloads/workspace/edge/src/ekuiper"`
The scenario tests a rule that selects all of the records from a stream.

- The script reads data from the file `iot_data.txt`, where the 1st column is `device_id`, the 2nd column is `temperature`, and the 3rd column is `humidity`. There are 10 records in total in the file.
- The processing SQL is `SELECT * FROM demo`, so all of the data will be processed and sent to the sinks.
- Another JMeter mock-up user subscribes to the MQTT result topic. JMeter validates the message number and content sent by the rule. If the record content is not correct, the JMeter response assertion fails. If the record number is not correct, the script will not stop until the CI (continuous integration) pipeline kills it with its timeout settings. If you run the script locally, you'll have to stop the test manually.
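The same setup can be reproduced by hand through the eKuiper REST API, which is handy when debugging a failed run. A minimal sketch, assuming eKuiper listens on `127.0.0.1:9081`; the column types and result topic are assumptions:

```shell
# Create the demo stream over the REST API
curl -X POST http://127.0.0.1:9081/streams \
  -d '{"sql": "CREATE STREAM demo (device_id bigint, temperature float, humidity bigint) WITH (FORMAT=\"JSON\", DATASOURCE=\"demo\")"}'

# Create a select-all rule that publishes every record to an MQTT result topic
curl -X POST http://127.0.0.1:9081/rules \
  -d '{"id": "select_all_demo", "sql": "SELECT * FROM demo", "actions": [{"mqtt": {"server": "tcp://127.0.0.1:1883", "topic": "result/select_all"}}]}'
```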
This scenario test is very similar to the last one, except that the rule filters the records with a condition.

- The processing SQL is `SELECT * FROM demo WHERE temperature > 30`, so all of the data with a temperature of 30 or below will be filtered out. The script reads data from the file `iot_data.txt`, 10 records in total.
- Another JMeter mock-up user subscribes to the MQTT result topic, and the expected results are saved in the file `select_condition_iot_data.txt`. If the record content is not correct, the JMeter response assertion fails. If the record number is not correct, the script will not stop until the CI (continuous integration) pipeline kills it with its timeout settings. If you run the script locally, you'll have to stop the test manually.
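To observe the filtering behaviour outside of JMeter, you can publish a couple of records by hand and watch the rule output. A small sketch; the source topic `demo` and the result topic are assumptions, so adjust them to match the rule you created:

```shell
# Watch the rule output in the background
mosquitto_sub -h 127.0.0.1 -t result/select_condition &

# This record should be dropped by the WHERE temperature > 30 condition...
mosquitto_pub -h 127.0.0.1 -t demo -m '{"device_id": 1, "temperature": 20, "humidity": 65}'

# ...while this one should show up on the result topic
mosquitto_pub -h 127.0.0.1 -t demo -m '{"device_id": 1, "temperature": 35, "humidity": 65}'
```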
The script automates the steps described in this blog post, except that the sink target is changed to a local EMQ broker (instead of AWS IoT Hub). The processing SQL is as follows.

`SELECT avg(temperature) AS t_av, max(temperature) AS t_max, min(temperature) AS t_min, COUNT(*) AS t_count, split_value(mqtt(topic), "/", 1) AS device_id FROM demo GROUP BY device_id, TUMBLINGWINDOW(ss, 5)`

Another JMeter mock-up user subscribes to the MQTT result topic and waits for 15 seconds to collect all of the analysis result arrays. With the BeanShell assertion, it calculates the total of `t_count` for devices 1 and 2. If the number is not correct, it fails.
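The same check can be done by hand, mirroring what the BeanShell assertion computes. A sketch that collects results for 15 seconds and totals `t_count`, assuming the result topic name, that each sink message is a JSON array of window results, and that `jq` is installed:

```shell
# Collect window results for 15 seconds (-W 15 ends the subscription),
# then sum every t_count value across all received windows
mosquitto_sub -h 127.0.0.1 -t result/window_rule -W 15 \
  | jq -s '[.[][] | .t_count] | add'
```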
This script creates a stream and a rule, then gets the metrics of the rule and asserts the number of messages processed in the stream processing pipeline. Additionally, the script stops, starts or restarts the rule and verifies the metric values of the rule. Another JMeter mock-up user subscribes to the MQTT result topic, and asserts the message number and contents.
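The rule lifecycle and metric checks map directly onto the eKuiper REST API. A minimal sketch, assuming a rule id of `rule1`:

```shell
# Read the rule metrics (records in/out per operator, etc.)
curl http://127.0.0.1:9081/rules/rule1/status

# Stop, start and restart the rule, then read the metrics again
curl -X POST http://127.0.0.1:9081/rules/rule1/stop
curl -X POST http://127.0.0.1:9081/rules/rule1/start
curl -X POST http://127.0.0.1:9081/rules/rule1/restart
curl http://127.0.0.1:9081/rules/rule1/status
```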
The script tests scenarios for the following cases:

- Results of the form `[{}]`. Another JMeter mock-up user subscribes to the MQTT result topic, and asserts the message number and contents.
- An `ORDER BY` statement based on an aggregation rule: `SELECT temperature, humidity, split_value(mqtt(topic), "/", 1) AS device_id FROM demo GROUP BY TUMBLINGWINDOW(ss, 10) ORDER BY device_id DESC, temperature`
The test script is used for testing the eKuiper EdgeX source. To run the script, an EdgeX message bus publish tool should be compiled and run during the test:

`# go build -o test/edgex/pub test/edgex/pub.go`

Run JMeter with the following command, specifying the `fvt` property in the JMeter command line; `fvt` is where you develop eKuiper, and the script will search for `test/edgex/pub` in that location.

`bin/jmeter.sh -Dfvt="/Users/rockyjin/Downloads/workspace/edge/src/ekuiper"`
The processing SQL is `SELECT * FROM demo WHERE temperature > 30`, so all of the data with a temperature of 30 or below will be filtered out. Another JMeter mock-up user subscribes to the MQTT result topic, and asserts the message number and contents.
The test script is used for testing how to specify another EdgeX source configuration in eKuiper. In the `edgex.yaml` configuration file, the following additional configuration is specified:

```yaml
application_conf: #Conf_key
  protocol: tcp
  server: localhost
  port: 5571
  topic: application
```
Use the `CONF_KEY` keyword to use the overridden configuration values specified in `edgex.yaml`:

`CREATE STREAM application () WITH (FORMAT="JSON", TYPE="edgex", CONF_KEY = "application_conf")`
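The same stream can be created by hand through the REST API, assuming the default REST port `9081`:

```shell
# Create the EdgeX stream that picks up the overridden application_conf configuration
curl -X POST http://127.0.0.1:9081/streams \
  -d '{"sql": "CREATE STREAM application () WITH (FORMAT=\"JSON\", TYPE=\"edgex\", CONF_KEY = \"application_conf\")"}'
```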
As with the steps required in `select_edgex_condition_rule.jmx`, the EdgeX value descriptor service and the message bus publish tool should be ready.
The test script verifies the EdgeX message bus sink. Only one message meets the condition of the created rule, and it will be sent to the EdgeX message bus sink. As with the previous two test cases, besides preparing the `pub` application, another `sub` application should also be prepared:

`# go build -o test/edgex/sub/sub test/edgex/sub/sub.go`
The test script verifies EdgeX array data type support. The rule uses JSON expressions in both the `SELECT` and `WHERE` clauses. The sink result is sent to the MQTT broker, and the script verifies the projection result in the sampler assertions. For this test script, you need to prepare the `pub` application.
The script is an end-to-end plugin test. It requires a mock HTTP server and also a plugin.
```shell
# go build -o test/plugins/service/http_server test/plugins/service/server.go
```
The script is an end-to-end portable plugin test. It requires a mock HTTP server and also a plugin, which will be built by `prepare_plugin.sh`.
```shell
# go build -o test/plugins/service/http_server test/plugins/service/server.go
```
The script has two parts, testing the Go and Python SDKs respectively. It covers the portable plugin CRUD operations and the running of the source, function and sink that are defined in that plugin.
The test script verifies the HTTP pull source. It sends requests to a server. The script sets `incremental` to true, so it will compare with the last result; if the responses of two requests are the same, it will skip sending out the result. This script also requires the server to be running; please refer to the last test case for how to compile and run it.
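For orientation, an HTTP pull stream is declared with the `httppull` source type; the URL, pull interval and the `incremental` flag come from the source configuration. A sketch only; the stream name and schema are assumptions, not the ones used by the script:

```shell
# Create a stream backed by the HTTP pull source (illustrative name and empty schema)
curl -X POST http://127.0.0.1:9081/streams \
  -d '{"sql": "CREATE STREAM http_demo () WITH (FORMAT=\"JSON\", TYPE=\"httppull\")"}'
```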
This script verifies the binary data support. The rule consumes the binary image data sent from the MQTT broker, processes it, and finally sends the image data back to the MQTT broker again. The script verifies the BASE64 encoding of the image produced by the eKuiper rule engine.

- `binary_image_hex.txt`: the data file of the image.
- `binary_image_base64.txt`: the BASE64 file of the image.
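To feed binary data manually, the image can be published as a raw MQTT payload to a stream declared with the binary format. A sketch; the stream name, topic and image file are assumptions:

```shell
# Create a stream that treats the MQTT payload as raw binary instead of JSON
curl -X POST http://127.0.0.1:9081/streams \
  -d '{"sql": "CREATE STREAM image_stream () WITH (FORMAT=\"BINARY\", DATASOURCE=\"image_topic\")"}'

# Publish the raw bytes of an image file as one MQTT message (-f sends the file content)
mosquitto_pub -h 127.0.0.1 -t image_topic -f ./example.png
```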
This script verifies the batch table support.

- The stream data are read from the file `iot_data_id.txt`, which defines an id, temperature and humidity.
- The table uses `lookup.json` as its content. It is like a dictionary that maps a name and size to each id.
- The result data will be sent back to the MQTT broker again and compared with `table_static_result_data.txt`.
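For reference, a batch table in eKuiper is declared with `CREATE TABLE` over a batch source such as the file source and then joined with the stream in the rule SQL. The following is a sketch under those assumptions; the column list and source options are illustrative and not the exact definition used by the script:

```shell
# Create a table whose content is loaded in batch from lookup.json (illustrative definition)
curl -X POST http://127.0.0.1:9081/tables \
  -d '{"sql": "CREATE TABLE lookup (id bigint, name string, size bigint) WITH (DATASOURCE=\"lookup.json\", FORMAT=\"JSON\", TYPE=\"file\")"}'
```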
This script verifies the continuously updated table support. Unlike the static table, the continuous table is updated once new data is received.

- The stream data are sent to the topic `iot`. The sent data are read from the file `iot_data_multi_topics.txt` when the topic name is matched, and they define the temperature and humidity.
- The table data are sent to the topic `state`. The sent data are also read from the file `iot_data_multi_topics.txt` when the topic name is matched, and they define the state. The table records the latest state from the `state` topic, which works like a control topic to switch the processing on or off.
- The result data will be sent back to the MQTT broker again and compared with `table_cont_result_data.txt`.
This script verifies the shared source instance across rules. By specifying the shared property as true in the stream definition, all rules using the stream will share the same source instance for better performance.

- The stream is created with the `SHARED` option set to true. The sent data are read from the file `iot_data.txt`, which defines the temperature and humidity.
- The result data will be sent back to the MQTT broker again and compared with `select_condition_iot_data.txt`, `select_condition_iot_data2.txt` and `select_condition_iot_data3.txt` respectively.
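A stream definition with the shared option looks roughly like the following; the stream name and schema are assumptions for illustration:

```shell
# Create a stream whose source instance is shared by every rule that references it
curl -X POST http://127.0.0.1:9081/streams \
  -d '{"sql": "CREATE STREAM demo (temperature float, humidity bigint) WITH (FORMAT=\"JSON\", DATASOURCE=\"demo\", SHARED = \"true\")"}'
```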
The test script is used for testing the eKuiper EdgeX connection selector. To run the script:

- A Redis server needs to be installed, since EdgeX uses Redis as its message bus.
- In `etc/connections/connection.yaml`, make sure the server and port are correct:

  ```yaml
  edgex:
    redisMsgBus: #connection key
      protocol: redis
      server: 127.0.0.1
      port: 6379
      type: redis
  ```

- Make sure `redis_srv` in the script is set to the right Redis address.
- An EdgeX message bus publish/subscribe tool should be compiled and run during the test:

  `# go build -o test/edgex/pub test/edgex/pub.go`

  `# go build -o test/edgex/sub test/edgex/sub.go`
- Specify the `fvt` property in the JMeter command line; `fvt` is where you develop eKuiper, and the script will search for `test/edgex/pub` and `test/edgex/sub` in that location:

  `bin/jmeter.sh -Dfvt="/Users/rockyjin/Downloads/workspace/edge/src/ekuiper"`
The JMeter script will direct eKuiper to subscribe to data from the EdgeX Redis message bus according to the connection info in `etc/connections/connection.yaml`, and sink the result to the EdgeX Redis message bus using the same connection info. JMeter uses the `pub` tool to generate data and the `sub` tool to get the processed result, and asserts the message number and contents.
The test script is used for testing the Redis KV storage. To run the script:

- A Redis server needs to be installed, since eKuiper will use Redis as its KV storage.
- In `etc/kuiper.yaml`, make sure `type` is set to `redis`, and the server, port and password are correct:

  ```yaml
  store:
    #Type of store that will be used for keeping state of the application
    type: redis
    redis:
      host: localhost
      port: 6379
      password: kuiper
      #Timeout in ms
      timeout: 1000
    sqlite:
      #Sqlite file name, if left empty name of db will be sqliteKV.db
      name:
  ```
- Make sure `redis_srv` in the script is set to the right Redis address.
- A Redis client tool should be compiled and run during the test:

  `# go build -o test/redis/set test/redis/set.go`

- Specify the `fvt` property in the JMeter command line; `fvt` is where you develop eKuiper, and the script will search for `test/redis/set` in that location:

  `bin/jmeter.sh -Dfvt="/Users/rockyjin/Downloads/workspace/edge/src/ekuiper"`
The JMeter script will create streams and rules by the REST API, and all metadata will be stored in Redis. JMeter uses the Redis client tool to get the configuration directly from Redis and compare it with the ones created via the REST API.
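You can also inspect the stored metadata directly with the standard Redis CLI. A minimal sketch; the exact key names eKuiper uses are not spelled out here, so the pattern match is deliberately broad:

```shell
# List every key in the Redis KV store (password taken from the config above), then dump one of them
redis-cli -h 127.0.0.1 -p 6379 -a kuiper keys '*'
redis-cli -h 127.0.0.1 -p 6379 -a kuiper get "<one-of-the-keys-listed-above>"
```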
This script verifies the memory source/sink used to form a rule pipeline of multiple rules, and the usage of dynamic properties in the MQTT sink to publish the result to multiple topics. This test also verifies the event time window to guarantee that the result is constant. The sent data are read from the file `iot_data_ts.txt`, which contains a timestamp.
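For orientation, a two-rule pipeline over the memory sink/source together with a dynamic MQTT topic looks roughly like the following. A sketch only; the stream/rule names, memory topic and topic template are assumptions:

```shell
# Rule 1: forward the stream into an in-memory topic so that other rules can consume it
curl -X POST http://127.0.0.1:9081/rules \
  -d '{"id": "pipe_in", "sql": "SELECT * FROM demo", "actions": [{"memory": {"topic": "channel/data"}}]}'

# A second stream that reads from the in-memory topic written by rule 1
curl -X POST http://127.0.0.1:9081/streams \
  -d '{"sql": "CREATE STREAM mem_demo () WITH (FORMAT=\"JSON\", TYPE=\"memory\", DATASOURCE=\"channel/data\")"}'

# Rule 2: publish each record to a per-device MQTT topic via a dynamic property
# (sendSingle emits one message per record so the {{.device_id}} template can resolve)
curl -X POST http://127.0.0.1:9081/rules \
  -d '{"id": "pipe_out", "sql": "SELECT * FROM mem_demo", "actions": [{"mqtt": {"server": "tcp://127.0.0.1:1883", "topic": "result/{{.device_id}}", "sendSingle": true}}]}'
```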