OpenConceptLab/ocl_online#89 | Support multiple AI Assistant outputs per input row #21
Conversation
paynejd left a comment
Reviewed with @paynejd. Four required changes before merge — inline suggestions cover the MapProject.jsx side; the display change is below.
- Capture `output_locale` on every entry — it's part of the request payload and meaningfully affects the response, so it belongs on the saved record and in the attribute strip.
- Drop the URI line from the visual display — once the resolved version is shown next to the template key, the URI is redundant. Keep `prompt_template_uri` in the data model (audit log + JSON viewer still see it).
- Read the resolved prompt version back from the invoke response (`response.data.template.version`). The intent is to record which version the AI Assistant used to generate the response, not which version (if any) we requested — clients can invoke with no version, let the assistant resolve it, and read it back from the response.
- Drop store-side dedup entirely — keep the full per-row history of AI responses. Move the "fire once" gate into the auto-match path only: auto-match fires once per row; a user can manually request as many additional AI recommendations as they want, even with the same (model, prompt, locale) combo.
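The third and fourth points can be sketched roughly as below. Note this is a hypothetical sketch, not the actual MapProject.jsx code: `buildAnalysisEntry`, `appendAnalysis`, and the entry field names are assumed for illustration; only `response.data.template.version` comes from the review above.

```javascript
// Hypothetical sketch: build one history entry per invocation, reading the
// resolved template version from the response rather than from the request.
const buildAnalysisEntry = (request, response) => ({
  model: request.model,
  output_locale: request.output_locale,             // saved on every entry
  prompt_template_uri: request.prompt_template_uri, // kept in the data model only
  // Record the version the assistant actually used, even when the request
  // omitted one and let the assistant resolve it:
  prompt_template_version: response.data.template.version,
  candidates: response.data.candidates,
});

// No store-side dedup on append: every response joins the per-row history,
// even a repeat of the same (model, prompt, locale) combo. Any "fire once"
// gating lives in the auto-match caller, not here.
const appendAnalysis = (row, entry) => ({
  ...row,
  analyses: [...(row.analyses || []), entry],
});
```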
Display side — `AICandidatesAnalysis.jsx` (outside the diff hunks, so written out here): replace the existing URI block (`analysis?.prompt_template_uri && <span>…URI:…</span>`, around line 128) with an `output_locale` block:
```jsx
{analysis?.output_locale &&
  <span style={{marginRight: '4px', display: 'inline-flex'}}>
    <Typography gutterBottom sx={{ color: 'text.secondary', fontSize: 12, mb: 0 }} component='span'>
      {t('map_project.output_locale')}:
    </Typography>
    <Typography gutterBottom sx={{ color: 'text.primary', fontSize: 12, mb: 0 }} component='span'>
      {analysis.output_locale}
    </Typography>
  </span>}
```

Add a `map_project.output_locale` translation key to en/es/zh alongside it.
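For the translation key, something along these lines in each locale resource; the file names and the es/zh strings are assumptions that should be checked against the project's actual locale files and a translator:

```json
// en.json (assumed path)
{ "map_project": { "output_locale": "Output locale" } }

// es.json — assumed translation, please verify
{ "map_project": { "output_locale": "Idioma de salida" } }

// zh.json — assumed translation, please verify
{ "map_project": { "output_locale": "输出语言" } }
```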
Backend verification: please confirm the new array-shaped analysis field round-trips cleanly through save→reload end-to-end. The load path defensively wraps singletons into [v], but I didn't verify the save path or the ocl-online DB column accept the array shape. A quick smoke test (create project → run AI analysis → save → reload → re-run on same row → confirm both entries persist and the pager shows 2/2) before merge would be reassuring.
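For reference, the defensive load-path wrapping mentioned above can be as small as the sketch below; `normalizeAnalyses` is a hypothetical name, not the actual ocl-online code, and the save path / DB column still need the end-to-end check regardless:

```javascript
// Hypothetical sketch of the defensive load path: older saved projects may
// hold a single analysis object where new ones hold an array, so wrap
// singletons into [v] and treat missing values as an empty history.
const normalizeAnalyses = value => {
  if (value === null || value === undefined) return [];
  return Array.isArray(value) ? value : [value];
};
```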
Linked Issue
Closes OpenConceptLab/ocl_online#89