Shortly after joining Quistor, my colleague Andreea Istrate, host of the JDE Talks webinar, presented me with a high-caliber challenge: a collaboration with Mark Herwege from Oracle.
The goal was to show how to use the Orchestrator Assertion Framework for test automation. I was pleasantly surprised: when I saw Mark, I realized we had already crossed paths at a previous Oracle Orchestrator Workshop.
The task came with great responsibility. Manual testing was the biggest risk and the primary cause of project delays. We understood that automation was not a luxury but a necessity to accelerate execution, improve coverage, and guarantee reliability after every change. We needed a robust, scalable solution for regression testing.
Imagine the pressure! Newly arrived at the company, my first major task was co-presenting a webinar—in English, a language I’m still refining—alongside one of the people responsible for validating key Orchestrator functionalities before they're shared with all of us. The challenge was huge, and the vision of total automation was the goal.
I accepted the challenge and fully immersed myself in testing the Assertion Framework.
I remembered that I had already written a post about it, at the suggestion of a digital plumber and social media friend, Dave Wagoner, so I wasn't starting from scratch. The framework was powerful, but I soon noticed a crucial limitation for large-scale automation: I could only view the assertion results when running orchestrators individually from the Orchestrator Studio. If automating meant running them one by one, the bottleneck was simply shifted.
I went back to Mark and explained the situation. He revealed the key: include an additional parameter in the input JSON. With this small change, the orchestrator would execute as usual, and the output JSON would also include the assertions (the test results).
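To make the idea concrete, here is a minimal sketch of what such a call could look like from a script. Everything in it is illustrative: the AIS server URL, the credentials, the orchestration name, and especially the includeAssertions flag are placeholders I made up for the example; check your Tools release documentation for the actual parameter name Mark described.

```python
import requests

# Hypothetical AIS server URL and credentials, for illustration only.
AIS_BASE = "https://ais.example.com:9300/jderest/v3/orchestrator"
AUTH = ("JDEUSER", "JDEPASS")

payload = {
    # Regular orchestration inputs (made-up names).
    "AddressNumber": "4242",
    # Illustrative name for the extra flag Mark described; the real
    # parameter name may differ in your Tools release.
    "includeAssertions": "true",
}

# TEST_AddressBook is a made-up test orchestration name.
resp = requests.post(f"{AIS_BASE}/TEST_AddressBook",
                     json=payload, auth=AUTH, timeout=60)
resp.raise_for_status()

result = resp.json()
# With the flag set, the response JSON also carries the assertion
# outcomes alongside the normal orchestration output.
print(result.get("assertions", result))
```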
With this piece of the puzzle solved, the idea was born: automation had to happen at scale. I developed a solution to efficiently chain the execution of multiple test orchestrators. Using a configurable CSV file, I could not only call several orchestrators in sequence; each one could also pull its specific input data and expected assertions from another CSV.
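As a rough sketch of that chaining idea, the loop below reads a hypothetical orchestrations.csv listing the test orchestrations and, for each one, a data CSV whose rows hold the input values (extra columns could equally carry the expected assertion values). The file layouts, column names, and the includeAssertions flag are all assumptions for illustration, not the tool's actual format.

```python
import csv
import requests

AIS_BASE = "https://ais.example.com:9300/jderest/v3/orchestrator"
AUTH = ("JDEUSER", "JDEPASS")

# orchestrations.csv (assumed layout): one test orchestration per row,
# each pointing at its own data file, e.g.
#   orchestration,data_file
#   TEST_AddressBook,ab_cases.csv
with open("orchestrations.csv", newline="") as cfg:
    tests = list(csv.DictReader(cfg))

for test in tests:
    # Each data file holds one test case per row; here the column
    # names double as the orchestration's input parameter names.
    with open(test["data_file"], newline="") as data:
        for case in csv.DictReader(data):
            payload = dict(case)
            payload["includeAssertions"] = "true"  # illustrative flag
            resp = requests.post(f"{AIS_BASE}/{test['orchestration']}",
                                 json=payload, auth=AUTH, timeout=60)
            print(test["orchestration"], resp.status_code)
```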
Thus, driven by the need for automation and technical innovation, the Assertion Manager was born.
We officially presented this tool in the JDE Talks webinar in February, demonstrating that large-scale test automation with native JDE tools was possible. You can watch the original webinar here: JDE Talks with Mark
Some time later, I saw the call for speakers for the Quest INFOCUS community event. I submitted the Assertion Manager as an educational session to share with the JDE community and was selected, taking this story and its results to what I consider the largest JDE event worldwide, where I finally got to meet Dave in person.

What is the Assertion Manager?
The Assertion Manager is the bridge between individual testing and an automation strategy. It is designed to:
Orchestrate Multiple Tests: Build a list of all the test orchestrators you've designed and execute them one after the other in the background.
Data and Result Management: Use configuration files (CSV) to feed each orchestrator with specific test data and compare the output against the expected results.
Centralized, Auditable Report: Capture the generated results in a detailed report (CSV) showing the expected value, the obtained value, the final result (success/failure), and the execution time. A simplified sketch of this flow follows the list.
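The sketch below shows one way such a report row could be produced and written to CSV. The output field name, the includeAssertions flag, the server details, and the comparison logic are simplifications I chose for the example; they are not the Assertion Manager's actual implementation.

```python
import csv
import time
import requests

AIS_BASE = "https://ais.example.com:9300/jderest/v3/orchestrator"
AUTH = ("JDEUSER", "JDEPASS")

def run_case(orchestration, inputs, expected):
    """Run one test orchestration and build a report row (assumed shape)."""
    start = time.perf_counter()
    resp = requests.post(f"{AIS_BASE}/{orchestration}",
                         json={**inputs, "includeAssertions": "true"},
                         auth=AUTH, timeout=60)
    elapsed = time.perf_counter() - start
    obtained = resp.json().get("OutputValue")  # illustrative output field
    return {
        "orchestration": orchestration,
        "expected": expected,
        "obtained": obtained,
        "result": "success" if str(obtained) == str(expected) else "failure",
        "seconds": round(elapsed, 3),
    }

# One made-up test case; in practice the rows would come from the CSVs.
rows = [run_case("TEST_AddressBook", {"AddressNumber": "4242"}, "4242")]

with open("assertion_report.csv", "w", newline="") as report:
    writer = csv.DictWriter(report, fieldnames=rows[0].keys())
    writer.writeheader()
    writer.writerows(rows)
```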
In summary, moving past manual testing is essential for an efficient and reliable JD Edwards update process. The strategic combination of JDE Orchestrator, its powerful Assertion Framework for defining and validating results, and the Assertion Manager for executing multiple tests and consolidating a report, offers a native and optimized solution.
It's time to work smarter with the right tools. 💡
I'd like to get to know you a bit as well. Write to me with your ideas and the stories you want to share!