Module IV - Building a Genesis Microservice for Processing the Sourced Data

Section IV - starting the service

We are now ready to start the microservice that listens for data on the genesis topic, stashes each document in the S3 bucket, and then brokers it downstream to the next topic, which is built dynamically from the metadata of the DaaSDocument.
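The brokering logic itself lives in the daas crate, but the idea behind the dynamically built topic can be sketched in a few lines. The following is a minimal, hypothetical illustration only (not the crate's actual code); it assumes the document metadata exposes category, subcategory, and source name fields, as suggested by the document id order~clothing~iStore~5000 in the log output below, and the real field names and delimiter may differ:

// Hypothetical sketch of deriving a downstream topic from document metadata.
struct DocMeta {
    category: String,
    subcategory: String,
    source_name: String,
}

fn downstream_topic(meta: &DocMeta) -> String {
    // Kafka topic names only allow [a-zA-Z0-9._-], so the segments are joined
    // with '.' here; the delimiter used by the daas crate may differ.
    format!("{}.{}.{}", meta.category, meta.subcategory, meta.source_name)
}

fn main() {
    let meta = DocMeta {
        category: "order".to_string(),
        subcategory: "clothing".to_string(),
        source_name: "iStore".to_string(),
    };
    // For the document seen in the log output below, this would broker the
    // payload to a topic such as "order.clothing.iStore".
    println!("next topic: {}", downstream_topic(&meta));
}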

There are two ways to start the service.

  1. Running with the cargo run command while developing (local service testing)

IMPORTANT: Run the executable in a new terminal so that the sourcing and genesis services can run in parallel.

NOTE: We provide the --bin myapp_genesis argument because the package now contains multiple executables, so we must specify which one to run.
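For context, each extra executable is declared with its own [[bin]] section in the Cargo.toml manifest. The sketch below is illustrative only; the binary names and paths must match what you defined in your own manifest (only myapp_genesis is confirmed in this section):

[[bin]]
name = "myapp_sourcing"             # sourcing endpoint from Module III (name and path illustrative)
path = "src/bin/myapp_sourcing.rs"

[[bin]]
name = "myapp_genesis"              # genesis processor started in this section (path illustrative)
path = "src/bin/myapp_genesis.rs"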

ArchConfWorkshopUser:~/environment/rust-daas (master) $ cargo run --bin myapp_genesis
    Finished dev [unoptimized + debuginfo] target(s) in 0.42s
     Running `target/debug/myapp_genesis`
Genesis processor is running ...
Press [Enter] to stop the Genesis processor.
[2020-11-04T21:28:23Z INFO  daas::service::processor] Putting document order~clothing~iStore~5000 in S3
[2020-11-04T21:28:23Z INFO  daas::service::processor] Brokering document order~clothing~iStore~5000 ... 
[2020-11-04T21:28:27Z INFO  daas::service::processor] Putting document order~clothing~iStore~5000 in S3
[2020-11-04T21:28:27Z INFO  daas::service::processor] Brokering document order~clothing~iStore~5000 ... 

To stop the service, press [Enter] (as the prompt indicates) or use Ctrl + C.

2. Running the compiled executable. First, build the project with the cargo build command.

ArchConfWorkshopUser:~/environment/rust-daas (master) $ cargo build
   Compiling kafka v0.8.0
   Compiling daas v0.2.0
   Compiling rust-daas v0.1.0 (/home/ec2-user/environment/rust-daas)
    Finished dev [unoptimized + debuginfo] target(s) in 26.49s

Whenever you use the cargo build command, it places the compiled executable in the target/debug directory, using the name defined in the Cargo.toml manifest.
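If you only want to rebuild this one binary rather than the whole package, cargo build also accepts the --bin flag, for example:

ArchConfWorkshopUser:~/environment/rust-daas (master) $ cargo build --bin myapp_genesis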

Since it is an executable, simply run it from the terminal.

ArchConfWorkshopUser:~/environment/rust-daas (master) $ ./target/debug/myapp_genesis 