How ESHCC uses dashboards to facilitate dialogue about educational quality

I meet regularly with analysts from other faculties and the Data Competence Hub to share knowledge and build connections.

Eva Rouwmaat

Data analyst

"Informed dialogue" is one of the five pillars of EUR's quality assurance vision. Since 2023, the Erasmus School of History, Culture and Communication (ESHCC) has supported this dialogue through an innovative data portal featuring Power BI dashboards. We spoke with Eva Rouwmaat (data analyst) and Mieke Sillekens (Education & Policy Manager) about making data accessible, fostering collaboration, and facilitating meaningful conversations about educational quality.

What drove the development of these dashboards?

'When I joined EUR two years ago, there was plenty of information available, but it was scattered and hard to access,' Mieke explains. 'At my previous employer, Power BI was integrated throughout the organization—I could find everything I needed in one place, which was incredibly helpful. I was excited that ESHCC shared the vision of making our existing information more accessible, so we could monitor progress continuously rather than just during major events like accreditations.'

Eva, who also joined EUR around the same time, worked with Mieke and several colleagues to assess what was needed. 'Mieke mapped out her requirements for site visits, various reports, and quality assurance monitoring. I gathered input from across the ESHCC faculty about what they needed in a dashboard. What you see in the portal today is essentially the standard data you'd traditionally find in site visit reports.'

What does the ESHCC data portal include?

The portal lives on a SharePoint page and houses multiple dashboards—both EUR-wide ones from the Data Competence Hub and ESHCC-specific ones. Access is granted by section to staff who work with the relevant data. The portal has three main components:

  • Student Success Dashboard
    This provides insights into student enrollment, BSA results, dropout rates, and graduation rates. Users can slice the data multiple ways: by faculty, department, programme, EEA/Non-EEA status, and study progress (on-time or N+1). An orange line shows the faculty average, giving departments a benchmark for comparison.
  • Admissions Dashboard
    A real-time dashboard that updates daily with new application data by programme. Users can compare year-over-year trends, break down data by EEA/Non-EEA status, and track applications through each stage of the process. This is our most frequently accessed dashboard and is embedded in a Teams channel for easy access.
  • Course Results Dashboard
    A comprehensive view of all courses offered, including enrollment numbers, pass rates, average grades, and the percentage of students earning 8+ grades. When available and response rates are sufficient, course evaluation results are also displayed. This dashboard is updated annually after each academic year concludes.

Who uses the portal and how?

The dashboards serve different user groups. The student success and course results dashboards are accessible to the vice-dean of education, programme directors, education management, and educational policy staff. 'We review these periodically with our three programme directors,' Mieke explains. 'We meet with one director each week, then bring everyone together in the fourth week. These meetings give us regular opportunities to discuss the data. We also use the portal when preparing self-evaluation reports and similar documents.'

The admissions dashboard has much broader usage. 'We've embedded this data in a Teams channel,' Eva notes. 'All academic coordinators can access it, making it really easy to track admissions daily. The dashboard updates every day.'

How involved is student representation?

Student representatives don't currently have direct portal access. 'They receive summary reports from my colleague about evaluations, which include analysis of the results,' Mieke explains. 'It hasn't come up as a request yet, but we could certainly explore giving them access.'

What impact has the dashboard had so far?

'The student success metrics vary significantly across programmes,' Mieke observes. 'We're currently conducting a study feasibility review using administrative agreement funding. We're not planning immediate overhauls, but we want to identify anything that stands out or needs attention.'

Do you systematically connect improvement actions to portal insights?

'We start by discussing notable findings with the relevant programme director,' Mieke explains. 'Sometimes we notice that certain courses have unusually high average grades or pass rates. We review these cases together, and programme directors follow up as needed.'

'What we could improve,' she continues, 'is tracking whether actions are actually taken. Programme directors often discuss concerning patterns with lecturers, but it's not really systematic. At the last Education Workshop, they presented a tool for setting and assigning actions to staff members. That looked promising. Such a tool could be valuable, as long as programme directors and lecturers retain control over the process.'

Has the dashboard improved educational discussions?

'Absolutely,' Mieke responds. 'It gives us information we wouldn't easily have otherwise. It serves as a benchmark. There might be perfectly logical explanations for variations, but it flags areas worth examining. We still need to work on completing the full PDCA cycle, but we've definitely moved in the right direction.'

How do you collaborate with other faculties?

'I meet regularly with analysts from other faculties and the Data Competence Hub to share knowledge and build connections,' Eva explains. 'The admissions dashboard data was a joint project with the Data Competence Hub, ESSB, and ESL. ESHPM also joined and created a similar dashboard, which gave me additional ideas. You learn from each other because this type of data is relevant across faculties. Everyone then customises it for their specific needs. For instance, we have many programmes with admission requirements, while ESL's situation is different.'

How do you monitor strategic goals from your educational vision, like 'impact' and 'sustainability'?

'That's a more complex challenge,' Mieke responds. 'We investigate these goals qualitatively and less frequently. For example, we surveyed how professional practice is integrated into our courses. Student assistants conducted this exercise by reviewing Canvas course pages.'

She explains why such information isn't in the data portal: 'We can't extract this type of data from existing systems. It involves complex information that isn't separately tracked—like whether courses include guest speakers. We could ask lecturers to register everything so we could capture it in dashboards, but since this information is needed less frequently, I'm not sure the additional workload would be proportional.'

What advice would you share with other faculties?

Eva: 'From a professional standpoint, I think you need to ensure that dashboards, data, and presentation methods align well with faculty meeting structures and discussion goals. Data analysts can be tempted to create insights just because we have the data, but those insights then often go unused.'

Mieke emphasizes that educational quality data collection shouldn't become counterproductive: 'It's about striking the right balance between gathering the information we need and avoiding excessive workload for academic staff and students.'

'It's about gradually developing more structured approaches without losing the informal elements,' Mieke concludes. 'The direct communication lines we have with programme directors within our faculty are truly invaluable.'
