Section VI - Testing the service

Step 1 - Checking the services

Let's first make sure all our services are running and restart our reporting service.

In a new terminal, run the following command:

./rust-daas/target/debug/myapp_reporting

In another terminal, let's run the sourcing script.

./scripts/curl-sourcing.sh 

You should see each service print log messages to the console describing the data it has handled.

Sourcing RESTful service

[2020-11-09T13:24:06Z INFO  actix_web::middleware::logger] 127.0.0.1:33482 curl/7.61.1
[2020-11-09T13:24:06Z INFO  actix_web::middleware::logger] 127.0.0.1:33482 "POST /order/clothing/iStore/5000 HTTP/1.1" 200 15 "-" "curl/7.61.1" 0.002745
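The log above shows the POST that curl-sourcing.sh issued. A minimal sketch of such a script follows; the host and port (localhost:8000) and the JSON body fields are assumptions for illustration, so check your actual script for the real values:

```shell
#!/bin/sh
# Hypothetical sketch of curl-sourcing.sh. The host:port (localhost:8000)
# and the JSON body fields are assumptions, not taken from the workshop code.
HOST="http://localhost:8000"
RESOURCE="order/clothing/iStore/5000"   # matches the POST in the log output above
URL="${HOST}/${RESOURCE}"
echo "POST ${URL}"
# Uncomment to actually source the order (requires the running sourcing service):
# curl -X POST "${URL}" \
#      -H "Content-Type: application/json" \
#      -d '{"product":"leather_jacket","quantity":1,"status":"new"}'
```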

Genesis service

[2020-11-09T13:24:06Z INFO  daas::service::processor] Putting document order~clothing~iStore~5000 in S3
[2020-11-09T13:24:06Z INFO  daas::service::processor] Brokering document order~clothing~iStore~5000 ... 

Order Clothing service

ArchConfWorkshopUser:~/environment $ ./rust-daas/target/debug/myapp_order_clothing 
Clothing Orders processor is running ...
Press [Enter] to stop the Clothing Orders processor.
Order Number 5000 from the iStore has a status of "new"...
Retreiving leather_jacket file

Reporting service

Step 2 - Calling the Reporting RESTful service

Let's first make sure the returned payload is correct based on the resource path.

In an available terminal, run the following script using a specific product:

NOTE: The JSON is an object for the specific product.

Now let's make sure the payload is correct when a product is not specified.

NOTE: The JSON is an array versus an object.
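The two reporting calls can be sketched as below. The reporting port (8001) and the exact resource paths are assumptions; substitute whatever paths your curl-reporting.sh script actually uses:

```shell
#!/bin/sh
# Sketch of the two reporting queries; port 8001 and the paths are assumptions.
REPORTING="http://localhost:8001"

# With a specific product: the response should be a single JSON object.
echo "GET ${REPORTING}/order/clothing/leather_jacket"
# curl "${REPORTING}/order/clothing/leather_jacket"

# Without a product: the response should be a JSON array of objects.
echo "GET ${REPORTING}/order/clothing"
# curl "${REPORTING}/order/clothing"
```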

Step 3 - Testing the Data Provisioning Flow

We still haven't verified that our DaaS platform is working when sourcing dynamic content. Let's begin by sourcing some variable content.

Open the curl-sourcing.sh script in the ./scripts directory and modify it so that we are ordering a different product.
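Assuming the product is specified in the JSON body of the POST (the "product" field name and the new value are assumptions), the edit might look like this:

```shell
#!/bin/sh
# Hypothetical edit to curl-sourcing.sh: only the product in the body changes.
# The "product" field name and the "winter_coat" value are assumptions.
BODY='{"product":"winter_coat","quantity":1,"status":"new"}'
echo "new POST body: ${BODY}"
# curl -X POST "http://localhost:8000/order/clothing/iStore/5000" \
#      -H "Content-Type: application/json" \
#      -d "${BODY}"
```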

IMPORTANT: Don't forget to save the file.

Rerun the ./scripts/curl-sourcing.sh command.

We want to make sure the order has been properly aggregated into our reporting data source, so let's rerun the ./scripts/curl-reporting.sh command.

Let's make another change to the curl-sourcing.sh script by changing the name of the store from iStore to myStore.

After you rerun the ./scripts/curl-sourcing.sh script, you should get a payload returned stating that the data cannot be processed.

This is because the DaaS SDK automatically verifies that the data being sent is coming from the original source that created it. This is possible because of the Data Tracker Chain feature from the pbd crate.

Update the curl-sourcing.sh script to the following, which has a Data-Tracker-Chain value that matches the resource path order/clothing/myStore/5000:
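A sketch of what that update might involve is below. The marker-chain JSON fields are illustrative assumptions, not the pbd crate's exact schema; the point is that the chain is base64 encoded and its data id must match the new resource path order~clothing~myStore~5000:

```shell
#!/bin/sh
# Hypothetical sketch: the Data-Tracker-Chain header is assumed to carry a
# base64-encoded marker chain. The JSON fields below are illustrative only;
# see the pbd crate's Data Tracker Chain documentation for the real schema.
CHAIN_JSON='[{"identifier":{"data_id":"order~clothing~myStore~5000","index":0}}]'
CHAIN=$(printf '%s' "$CHAIN_JSON" | base64 | tr -d '\n')
echo "Data-Tracker-Chain: ${CHAIN}"
# curl -X POST "http://localhost:8000/order/clothing/myStore/5000" \
#      -H "Data-Tracker-Chain: ${CHAIN}" \
#      -H "Content-Type: application/json" \
#      -d '{"product":"leather_jacket","quantity":1,"status":"new"}'
```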

You should be able to confirm the following items:

Sourcing

  • {"status":"ok"} response

  • ./local_storage/clothing/myStore/5000 directory with your DaaSDocument JSON file

Genesis Processor

Kafka topics dynamically created

Provisioning Processor

Reporting
