implement http request send #181
Conversation
GitLab Pipeline Action

General information
Link to pipeline: https://gitlab.com/code0-tech/development/taurus/-/pipelines/2479538226
Status: Passed

Job summaries
docs:preview — Documentation preview available at https://code0-tech.gitlab.io/-/development/telescopium/-/jobs/14089213288/artifacts/out/index.html
Pull request overview
Implements the missing `http::request::send` runtime function in taurus-core, enabling flows/scripts to perform outbound HTTP requests and receive structured responses.
Changes:
- Added `http::request::send` handler using `ureq`, including request header/payload encoding and response decoding.
- Added helpers for content-type handling, header conversion, and JSON/text/plain payload mapping.
- Added unit tests covering request payload encoding, header validation, and a TCP roundtrip.
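For orientation, here is a minimal standalone sketch of the kind of roundtrip such a handler performs, assuming the ureq 2.x API (the `headers_names()`/`header()` calls in the excerpts below point to 2.x); the URL, payload, and function name are illustrative, not taken from the PR:

```rust
use std::collections::HashMap;

// Illustrative only: endpoint, payload, and function name are made up.
fn send_example() -> Result<(), Box<dyn std::error::Error>> {
    let response = ureq::post("https://example.com/api")
        .set("Content-Type", "application/json")
        .send_string(r#"{"hello":"world"}"#)?;

    // Collect response headers before consuming the body; lowercasing the
    // names keeps lookups predictable (see the review comment further down).
    let mut fields: HashMap<String, String> = HashMap::new();
    for name in response.headers_names() {
        if let Some(value) = response.header(&name) {
            fields.insert(name.to_ascii_lowercase(), value.to_string());
        }
    }

    let body = response.into_string()?;
    println!("{} headers, body: {}", fields.len(), body);
    Ok(())
}
```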
Reviewed changes
Copilot reviewed 3 out of 4 changed files in this pull request and generated 4 comments.
| File | Description |
|---|---|
| crates/taurus-core/src/runtime/functions/http.rs | Adds http::request::send implementation + helpers + tests for encoding/decoding and request/response mapping. |
| crates/taurus-core/Cargo.toml | Adds serde_json and ureq dependencies needed by the new HTTP sender. |
| Cargo.toml | Adds ureq to workspace dependencies. |
| Cargo.lock | Locks new transitive dependencies introduced by ureq. |
```rust
let mut fields = HashMap::new();
for name in response.headers_names() {
    if let Some(value) = response.header(&name) {
        fields.insert(name, value.to_string().to_value());
```
decode_headers inserts response header names into the returned Struct without normalizing case. Struct field lookup is case-sensitive, but HTTP headers are case-insensitive, so consumers may fail to find headers depending on the server's header casing (or ureq's internal normalization). Consider lowercasing (or otherwise normalizing) header names before inserting them so access is predictable.
Suggested change:

```diff
-        fields.insert(name, value.to_string().to_value());
+        fields.insert(name.to_ascii_lowercase(), value.to_string().to_value());
```
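As a standalone sketch of why this matters (plain `HashMap`s stand in for the crate's struct type, which is an assumption here):

```rust
use std::collections::HashMap;

fn main() {
    // Server-cased key: a case-sensitive lookup with "content-type" misses it.
    let mut raw: HashMap<String, String> = HashMap::new();
    raw.insert("Content-Type".into(), "application/json".into());
    assert!(raw.get("content-type").is_none());

    // Lowercased on insert: the same lookup now succeeds predictably.
    let mut normalized: HashMap<String, String> = HashMap::new();
    normalized.insert("Content-Type".to_ascii_lowercase(), "application/json".into());
    assert!(normalized.get("content-type").is_some());
}
```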
```rust
if let Ok(text) = String::from_utf8(bytes.clone()) {
    if content_type
        .as_deref()
        .map(content_type_is_json)
        .unwrap_or(false)
    {
        if let Ok(json) = serde_json::from_str::<JsonValue>(&text) {
            return Ok(from_json_value(json));
        }
    }

    return Ok(text.to_value());
}

let values: Vec<i64> = bytes.iter().map(|byte| *byte as i64).collect();
Ok(values.to_value())
```
decode_response_payload clones the full response byte buffer just to attempt UTF-8 decoding (String::from_utf8(bytes.clone())). This can double peak memory for large payloads. You can avoid the clone by moving bytes into String::from_utf8(bytes) and, on error, using the returned error’s into_bytes() for the binary fallback.
Suggested change:

```diff
-if let Ok(text) = String::from_utf8(bytes.clone()) {
-    if content_type
-        .as_deref()
-        .map(content_type_is_json)
-        .unwrap_or(false)
-    {
-        if let Ok(json) = serde_json::from_str::<JsonValue>(&text) {
-            return Ok(from_json_value(json));
-        }
-    }
-    return Ok(text.to_value());
-}
-let values: Vec<i64> = bytes.iter().map(|byte| *byte as i64).collect();
-Ok(values.to_value())
+match String::from_utf8(bytes) {
+    Ok(text) => {
+        if content_type
+            .as_deref()
+            .map(content_type_is_json)
+            .unwrap_or(false)
+        {
+            if let Ok(json) = serde_json::from_str::<JsonValue>(&text) {
+                return Ok(from_json_value(json));
+            }
+        }
+        Ok(text.to_value())
+    }
+    Err(err) => {
+        let bytes = err.into_bytes();
+        let values: Vec<i64> = bytes.iter().map(|byte| *byte as i64).collect();
+        Ok(values.to_value())
+    }
+}
```
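Isolated in a tiny sketch, the point is that `String::from_utf8` takes ownership of the buffer and `FromUtf8Error::into_bytes()` hands the same buffer back on failure, so neither path copies (the helper name is made up for illustration):

```rust
// Returns the bytes as text when valid UTF-8, otherwise hands the
// original buffer back unchanged; no clone on either path.
fn text_or_bytes(bytes: Vec<u8>) -> Result<String, Vec<u8>> {
    String::from_utf8(bytes).map_err(|err| err.into_bytes())
}

fn main() {
    assert_eq!(text_or_bytes(b"hi".to_vec()), Ok("hi".to_string()));
    assert_eq!(text_or_bytes(vec![0xff]), Err(vec![0xff]));
}
```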
```rust
}

fn content_type_is_json(content_type: &str) -> bool {
    content_type.contains("/json") || content_type.contains("+json")
```
content_type_is_json uses substring checks (contains("/json")), which will classify non-JSON media types like application/jsonp as JSON. This can lead to incorrect request encoding / response decoding. Consider matching on a normalized type and using stricter rules (e.g., exact application/json, or suffix +json, or an explicit /json at the end of the type).
Suggested change:

```diff
-    content_type.contains("/json") || content_type.contains("+json")
+    let media_type = content_type
+        .split(';')
+        .next()
+        .unwrap_or("")
+        .trim();
+    media_type == "application/json"
+        || media_type.ends_with("/json")
+        || media_type.ends_with("+json")
```
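A hypothetical unit test (not from the PR) marking where the stricter matcher and the original substring check diverge:

```rust
#[test]
fn classifies_json_media_types() {
    // Both versions accept the common cases...
    assert!(content_type_is_json("application/json"));
    assert!(content_type_is_json("application/json; charset=utf-8"));
    assert!(content_type_is_json("application/problem+json"));
    assert!(content_type_is_json("text/json"));
    // ...but only the stricter version rejects jsonp, which the
    // substring check misclassifies because it contains "/json".
    assert!(!content_type_is_json("application/jsonp"));
}
```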
Resolves: #157