From monthly safety reports to a board-ready HSE package
Ciputra Group manages 87+ projects across 34 cities in Indonesia. Its HSE department used Nodejam to consolidate monthly reporting, build leadership summaries, visualize compliance findings, and assemble a complete board pack within a single workspace, producing 29 files across 7 AI sessions over four weeks.
February 2026 · 6 min read

The challenge
Ciputra Group is one of Indonesia's largest property developers, with more than 87 projects across 34 cities. The group's residential developments include township-scale communities where occupational health and safety is treated as a material operating concern. At the site level, the HSE department manages monthly incident reporting, contractor scoring, non-conformance reviews, and quarterly board reporting for senior leadership.
Before Nodejam, each part of the reporting chain lived in a different application. Monthly HSE reports arrived as separate spreadsheet workbooks, one per month. Contractor scoring forms lived in their own files. Non-conformance records were tracked independently. When it came time to brief a director or prepare board materials, the HSE lead had to manually pull numbers from operational spreadsheets, rewrite them into management-level memos in a word processor, and rebuild the narrative again in a presentation tool. Every reporting cycle meant re-entering the same data across three applications, with every artifact disconnected from its source.
Scattered tools
- Monthly HSE reports as separate spreadsheet workbooks
- Contractor scoring forms disconnected from reporting data
- Non-conformance records tracked independently
- Management memos rewritten from scratch in a word processor
- Board presentations rebuilt manually each quarter from multiple sources
One workspace
- Monthly workbooks consolidated into a single Q1 monitoring spreadsheet
- Scoring, NCR, and reporting data accessible in the same project
- Leadership memos generated directly from operational data
- Director presentations synthesized from existing project files
- All work conducted in Indonesian with no language switching
Use case highlights
From Monthly Reports to a Management Action Plan

The field operator begins by importing three monthly HSE spreadsheets, one for each month of the quarter, along with a separate scoring input spreadsheet. From those four sources, Nodejam builds a Monitoring Workbook, a five-sheet quarterly spreadsheet that consolidates incident data, compliance scores, and trend lines across all three months. The department lead then takes that monitoring workbook and asks for a leadership summary. Before writing, Nodejam pauses to ask one clarifying question about the target audience; once the answer is clear, it creates a Management Memo, a text document with ten content sections covering the quarter's HSE performance in language suited to senior leadership. From that memo, it creates a 30-Day Priority Plan, a text document that turns the evaluation findings into a sequenced list of follow-up actions ranked by urgency.
With separate tools, the three monthly spreadsheets would live in a spreadsheet application, the leadership memo would be drafted from scratch in a word processor, and the priority plan would be typed in yet another document with no structural link to the monitoring data behind it. Updating a compliance figure in one of the monthly reports would mean re-opening the rollup spreadsheet, then the memo, then the action plan, each time hoping the numbers stay consistent across applications. Here the entire chain stays in one workspace, so the 30-day priority plan at the end is still directly connected to the monthly HSE spreadsheets it started from.
From Messy NCR and Tender Inputs to Contractor Action

The field operator runs two parallel tracks that converge into a single analysis. On the first track, he imports NCR source data as a spreadsheet and creates NCR Visuals, a slide deck that turns raw non-conformance findings into clear, presentation-ready charts and category breakdowns. On the second track, he imports tender scoring inputs and creates a Scoring Workbook, a structured spreadsheet that organizes contractor evaluations by cluster and assessment criteria. Those two tracks converge when Nodejam combines the NCR visuals and the scoring workbook into a Cluster Analysis, a spreadsheet that maps non-conformance patterns against contractor performance scores so the team can see which clusters need the most attention. From that analysis, it creates a Contractor Follow-Up Tracker, a spreadsheet that assigns specific corrective actions to contractors with deadlines attached, so the workflow ends in accountable follow-through rather than stopping at diagnosis.
In a traditional setup, the NCR source data would live in a spreadsheet application, the visual breakdowns would be rebuilt by hand in a presentation tool, and the tender scoring inputs would sit in a separate file with no connection to either. Building the cluster analysis would mean copying figures from the slides and the scoring file into yet another spreadsheet, then manually constructing the follow-up tracker in a separate workbook. Every step would break the link between a contractor's score and the non-conformance data that justified it. Here all four file types stay in one workspace, so the follow-up actions at the end are still directly tied to the raw NCR data and tender inputs they started from.
From HSE Working Files to a Board Pack

The department lead closes the quarter by pulling together four working files that span three file types. On the spreadsheet side, that means the Monitoring Workbook and the Cluster Analysis. On the text side, that means the Management Memo. On the slides side, that means the NCR Visuals. Nodejam reads across all four and combines them into a single Board Pack. From that pack, it produces a Board Dashboard, a four-sheet spreadsheet surfacing the quarter's key metrics for director-level review. It creates a Board Summary, a text document restating the quarter's HSE position in executive language. It builds a Board Presentation, a five-page slide deck ready for the boardroom. And it generates a Board Action Tracker, a two-sheet spreadsheet logging director-level decisions and their follow-up owners. One request to the workspace produced four coordinated outputs across three file types, each drawing from the same underlying operational data.
This is the step where separate tools break down most visibly. The monitoring workbook and cluster analysis would sit in a spreadsheet application, the management memo in a word processor, and the NCR visuals in a presentation tool. Assembling the board pack would mean opening all three applications, copying data into a new slide deck, rewriting the narrative in a fresh document, rebuilding the dashboard in another spreadsheet, and creating the action tracker in yet another, each time re-entering figures by hand and hoping nothing drifts between files. Every quarterly cycle would repeat that assembly from scratch. Here the department lead makes one request and all four board outputs are generated from the same working files the team already built during the quarter, so every number in the dashboard, every sentence in the summary, and every slide in the presentation traces back to the operational data that produced it.
Usage data
- Total files: 29
- Text documents: 7
- Spreadsheets: 18
- Slide decks: 4
Project Files
Over four weeks the team produced 29 files: 18 spreadsheets, 7 text documents, and 4 slide decks. The two roles landed in clearly different places. The field operator lived in spreadsheets, which fits raw monthly consolidation, tender scoring cleanup, and contractor tracking. The department lead kept a balanced mix of text and spreadsheets, and owned most of the slide work for the director presentations and the board pack. One workspace covered every file type these two people needed, and neither had to leave it for a separate application.
Session Activity
The pattern follows how the department actually worked through the quarter-close cycle. The first week opens on the operator side, consolidating three monthly reports into a quarterly monitoring workbook. The second week shifts to the department lead, who turns that data into a leadership summary after one targeted audience clarification. Mid-month returns to the operator for NCR visuals and tender scoring work. In the fourth week, both roles land on the same day with the director presentation and the contractor follow-up tracker; the workspace then goes quiet for two days before February 27 closes the window with the board pack assembled from the quarter's working files. The shape reads like ordinary departmental software in the hands of a real team, not the single-week burst of a scripted demo.
Agent Execution Reliability
28 tool calls
- Completed on first attempt: actions that went through cleanly with no retry needed.
- Auto-recovered via retry: brief network or upstream hiccups that resolved on their own without the user noticing.
- Clarifications requested: a moment where the AI paused to ask a quick question before writing, because the answer would have changed the output.
Across 28 agent actions over four weeks, 25 (89.3%) went through on the first attempt with no retry needed. Of the rest, two (7.1%) were brief network or upstream hiccups that recovered on their own without the user noticing, and the remaining action (3.6%) was a single moment where the AI paused to ask a clarifying question before writing, because the choice of audience would have changed the output. Every single session ended with a finished file. Asking a clarifying question is intentional behavior, not a stumble. The AI stays autonomous on operational work and only checks in when the answer would change what gets produced.
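As a quick sanity check on that split, the counts implied by the stated percentages can be verified directly (a minimal sketch; the per-category counts of 25, 2, and 1 are inferred from the published percentages of 28 total actions):

```python
# Reliability breakdown inferred from the reported percentages of 28 tool calls.
total = 28
first_attempt = 25   # completed on first attempt (89.3%)
retried = 2          # auto-recovered via retry (7.1%)
clarifications = 1   # clarifying questions asked (3.6%)

# The three categories should account for every action.
assert first_attempt + retried + clarifications == total

for label, count in [("first attempt", first_attempt),
                     ("auto-recovered", retried),
                     ("clarification", clarifications)]:
    print(f"{label}: {count}/{total} = {count / total:.1%}")
# → first attempt: 25/28 = 89.3%
# → auto-recovered: 2/28 = 7.1%
# → clarification: 1/28 = 3.6%
```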
What we learned
Without training or onboarding, both team members produced role-specific deliverables from day one. The field operator stayed close to raw operational data while the department lead worked at the management and board level. Across the entire program, the AI maintained awareness of every file in the project and asked exactly one clarifying question, in the session where audience choice would have changed the output.
Safety reporting naturally moves through every file type. A set of monthly spreadsheets becomes a quarterly monitoring workbook, which becomes a leadership memo, which informs a director presentation, which generates a decision tracker for follow-up execution. Non-conformance data needs visual explanation in slides, not just rows in a table. Board materials need a different register than operational notes. In Nodejam, that entire chain stays in one place rather than splitting across three applications.
Across 29 files, 7 AI sessions, and 58 messages, two team members moved from raw monthly data to a finished board package without switching applications. The final session alone read four existing project files and produced four new artifacts spanning spreadsheets, text, and slides, without a single number re-entered by hand.
For a property developer managing safety compliance across 87 projects in 34 cities, that continuity adds up. Each quarterly review, each contractor scoring round, each board meeting builds on files that already exist in the project. The department kept working the way it always had, in Indonesian, with real operational data, just without the re-entry tax of moving between three separate tools.