The Challenge: Create and Pilot Ellevation Benchmark Assessment

This week Ellevation celebrated a huge milestone: the first student took the Ellevation Benchmark Assessment (EBA) pilot. In 8 months, we went from a solid but roughly-sketched proof of concept to a suite of English language proficiency assessments for grades 2-12. The story of how we got here exemplifies the importance of steering into risks and delivering value in thin slices.

Mitigating Risks

To hit our deadline, we knew we had to be strategic about what we built and, more importantly, what we didn’t. We began by mapping out the known and unknown risks.

  • The Known High-Scope Risk: A Content Management System (CMS) for housing assessment items would reduce the load for internal assessment specialists, but building one would require far more work than our timeline allowed. Since a CMS is not student- or teacher-facing and offers little direct value to partners, we made a critical decision: we would de-prioritize a full-fledged CMS in our initial build.
  • The Unknown, Riskier Areas: We identified three areas with higher uncertainty and greater potential for unforeseen challenges:
    • A Scoring Process: This is a core component for evaluating productive responses (e.g., spoken or written answers).
    • Integration with Our Existing Product Suite: Administrators and teachers need to be able to assign the assessment, monitor student progress, and view scores. Integration needs to be seamless.
    • The Student Player: This is the user-facing interface; it had to be intuitive and reliable.

The Thin Slice Approach: A Spreadsheet Solution

At Ellevation, we believe in steering into risks. We approach the work with several mitigation strategies, including delivering value in thin slices: “build small, demonstrable increments that validate assumptions and reduce the cost of technical surprises.” Our decision to table the CMS development led us to a lean, yet effective, alternative for content management: a spreadsheet-based solution.

Instead of building a full web application with a rich user interface, we had our assessment specialists manage content—items, stimuli, graphics, and audio files—directly in a structured spreadsheet. Engineers then wrote lightweight scripts to push that structured data into the student-facing player, the teacher-facing platform, and our internal data warehouse.
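
To give a sense of how small these scripts can be, here is a minimal sketch of an ingest step in that spirit: read the spreadsheet export, validate it, and emit JSON for downstream consumers. The column names and output shape below are illustrative assumptions, not our actual schema.

```python
import csv
import json
import sys

# Columns the ingest step expects in the spreadsheet export.
# These names are illustrative; the real sheet's schema may differ.
REQUIRED_COLUMNS = {"item_id", "grade_band", "domain", "prompt", "audio_file"}


def load_items(csv_path):
    """Read the spreadsheet export and return a list of item dicts,
    failing fast on missing columns or empty required fields."""
    with open(csv_path, newline="", encoding="utf-8") as f:
        reader = csv.DictReader(f)
        missing = REQUIRED_COLUMNS - set(reader.fieldnames or [])
        if missing:
            raise ValueError(f"Spreadsheet is missing columns: {sorted(missing)}")

        items = []
        for line_num, row in enumerate(reader, start=2):  # row 1 is the header
            blank = [col for col in REQUIRED_COLUMNS if not (row[col] or "").strip()]
            if blank:
                raise ValueError(f"Row {line_num}: empty required fields {blank}")
            items.append(
                {
                    "id": row["item_id"].strip(),
                    "gradeBand": row["grade_band"].strip(),
                    "domain": row["domain"].strip(),
                    "prompt": row["prompt"].strip(),
                    "audioFile": row["audio_file"].strip(),
                }
            )
        return items


if __name__ == "__main__":
    # Usage: python ingest_items.py items_export.csv > player_items.json
    json.dump({"items": load_items(sys.argv[1])}, sys.stdout, indent=2)
```

One validated item list like this can feed the player, the teacher-facing platform, and the warehouse loader, so all three destinations work from a single source of truth.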

This simple approach was a game-changer. It drastically reduced our development time for content tooling while keeping manual data entry work to a minimum, freeing up our resources to tackle the higher-risk areas we’d identified. By focusing on core product functionality from day one, we were able to build a working prototype much faster. Further, this approach allows us to iterate much more easily and at much lower cost. The test design is rooted in our build-to-learn mindset, and our nimble spreadsheet solution allows us to revise and replace test elements without the burden of restructuring or reconfiguring a CMS.

Overcoming Technical Challenges

With the content tooling challenge solved, we were able to focus our energy on the critical components that students and administrators would actually see:

  • Developing an Intuitive Student Player: The player was designed with a clean, distraction-free interface, ensuring a smooth experience for students taking the assessment. We focused on accessibility and performance to support a wide range of devices and students.
  • Identifying a Viable Scoring Process: We laid the foundation for a scoring system and established a process that allows us to score responses, provide domain scores to teachers and administrators, and analyze test and item performance.
  • Integrating Seamlessly with Our Product Suite: We built deep integrations with the existing systems administrators already know, flattening their learning curve.

Lessons Learned and Future Directions

This project taught us the immense value of risk prioritization. By confronting our biggest unknowns and de-prioritizing a known, but resource-intensive, task, we were able to deliver a functional product on time.

The process underscored the power of iterative development. We didn’t need to build the perfect tool from the start; we needed to build a working one. The spreadsheet-based content tooling, while not a long-term solution, was an imperfect but good-enough tool for this phase.

During this first round of content ingest, our content team tracked points of friction and identified key areas for improvement. Because the solution was so simple, we were able to make changes quickly and easily, such as ensuring that formatting in the spreadsheet content was preserved in the player. We also identified steps in the process that were prone to human error and automated or standardized them.
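
As one purely illustrative example of that kind of automation (not our actual check), a small pre-publish step can confirm that every audio file named in the spreadsheet exists in the staged asset folder, turning a silent content error into a loud, early failure:

```python
from pathlib import Path


def find_missing_audio(items, asset_dir):
    """Return the IDs of items whose referenced audio file is not present
    in the staged asset folder, so a filename typo is caught before publishing."""
    asset_dir = Path(asset_dir)
    return [
        item["id"]
        for item in items
        if item.get("audioFile") and not (asset_dir / item["audioFile"]).is_file()
    ]


# Example: block publishing if any referenced audio is missing.
# missing = find_missing_audio(items, "content/audio")
# if missing:
#     raise SystemExit(f"Missing audio for items: {missing}")
```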

Communication and collaboration also played a crucial role in our success. Talking about risks from all perspectives and working together to prioritize helped ensure that we made the most appropriate tradeoffs. Throughout the pilot window, we will learn more about the test design, item performance, player functionality, and scoring process. Collaboration will help us translate our learnings into improvements.

As we look to the future, we will continue doing what we do best: deeply understanding risk/reward and incrementally investing in all areas of our product, including content tooling. With a strong sense of vision for how a user-friendly, streamlined content management system could take our product to the next level, we’ll continue to navigate tradeoffs with agility.

Joe Gester

Joe is a Staff Software Engineer working on Ellevation's Benchmark Assessment and Strategies products. Outside of work, he enjoys cooking for his family, hiking, and ceramics.

Erica Schmid

Erica is the Manager of Education Solutions and works on the team developing the Ellevation Benchmark Assessment. In her spare time, she loves hanging out with family, watching the Phillies and Eagles win, and trying new restaurants.