Frequently Asked Questions
How can I submit or update data for my state?
If you are a member of an ECEP state team, follow the instructions in the CMP data template saved in your state’s folder in Google Drive. To submit new data or update existing dashboard data for your state, make the necessary changes or additions to the data file (the CMP template) stored in your state’s folder, then let a member of the ECEP leadership team know that you have updated your data.
If you want to submit data but are not a member of an ECEP state team, please email us at ecepalliance@gmail.com.
Can I submit data for a state that is not a member of the ECEP Alliance?
Yes. For more information, please email us at ecepalliance@gmail.com.
How is CS defined in this dashboard? Which courses count as CS?
Earlier iterations of this work attempted to establish consensus among ECEP states regarding which courses count as CS. It was easy to agree on a common conceptual definition of CS education but difficult to agree on which computing and computing-related subfields that definition includes. Thus, rather than restricting which courses may be submitted, this dashboard leaves that decision to the end user: every course is assigned to one of a set of categories, and the user selects which categories are included in or excluded from the visuals and metrics reported here.
How are the course categories defined?
There are 10 categories based on different subfields of CS: AI & Machine Learning, Core Computer Science, Cybersecurity, Data Science, Databases, Information Systems, Information Technology, Networking, Robotics, and Web/Software Development. Each category has an inclusion criterion that determines whether a course in that subfield belongs to the category; courses that do not meet the criterion are assigned to a separate CS-Related category. For example, courses related to web/software development are assigned to the Web/Software Development category if the course focuses on web or software programming and to the CS-Related category if it focuses on web or software design. Each state team categorized all the CS and CS-related courses from their state into one of these 11 categories.
How are the course levels defined?
All courses are designated as either basic or advanced. Basic courses are introductory-level courses that do not require other CS or CS-related courses as prerequisites. Advanced courses require other CS or CS-related courses as prerequisites, require specialized prior knowledge, or are otherwise substantially more rigorous than basic courses. These definitions allow some subjectivity in deciding whether a course is basic or advanced. This is intentional: any set of purely objective criteria is likely to have exceptions. The definitions are designed to provide enough guidance to ensure commonality across states while allowing some influence from the education experts in each state who compile and submit the data.
What schools and grade levels are included?
The data include all public high schools in each state, where “high school” is defined as any school serving at least one grade from 9th to 12th. All student counts include only 9th–12th graders, even for schools that serve additional grade levels (e.g., K-12 or 6-12 schools). Exceptions for particular states are noted within the dashboard. Data from some states may not include every public high school in the state because school district reporting requirements differ by state. For example, school districts in Minnesota are not required to report course enrollment information to the state, so the Minnesota data in this dashboard include only schools and students from districts that reported course enrollment data to the state.
How are course enrollment metrics computed?
In general, course enrollment rates are computed by dividing the number of students enrolled in the selected courses by the total number of students. Some states were unable to provide the number of unique students enrolled and instead provided the total number of course enrollments. For these states, the course enrollment metrics may be slightly inflated because a student who, for example, enrolled in two relevant courses at the same time may be counted twice. These exceptions are noted in the relevant places within the dashboard.
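The calculation described above can be sketched as follows. This is an illustrative example with hypothetical numbers, not the dashboard's actual code; the function name and inputs are assumptions for the sake of the sketch.

```python
def enrollment_rate(enrolled: int, total_students: int) -> float:
    """Course enrollment rate: students enrolled in the selected courses
    divided by the total number of students (grades 9-12).

    Note: if `enrolled` counts course enrollments rather than unique
    students (as for some states), the rate may be slightly inflated,
    since a student taking two relevant courses is counted twice.
    """
    if total_students <= 0:
        raise ValueError("total_students must be positive")
    return enrolled / total_students

# Hypothetical example: 120 students enrolled in the selected CS courses
# out of 1,500 students in grades 9-12.
print(f"{enrollment_rate(120, 1500):.1%}")  # 8.0%
```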
How does the dashboard account for data that were suppressed in compliance with state requirements for data privacy?
Some states provided data in which certain values were suppressed to comply with state data privacy requirements. To reduce the amount of missing information in the dashboard, suppressed values were imputed prior to inclusion. For suppression applied to small cell counts, the imputed value was set to the midpoint between 1 and the upper-bound suppression threshold. For example, for states that suppressed counts less than 10, a value of 5 was imputed. To minimize the amount of error introduced by this imputation technique, and because most dashboard metrics are reported as percentages, imputation was applied only when the margin of error for the relevant metric was two percentage points or less. For example, if the number of Asian students enrolled in CS in a particular school was suppressed because the count was less than 10, a value would be imputed only if the total number of students enrolled in CS at that school was at least 161. This ensures that the reported percentage of Asian students enrolled in CS at that school could not differ from the true value by more than two percentage points. If imputing a value would result in a margin of error greater than two percentage points, the value was set to missing. Similar imputation procedures were followed when other suppression techniques were used.
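A minimal sketch of this kind of rule is below. It is an illustration of the general approach (midpoint imputation gated by a worst-case error check), not the dashboard's internal implementation; the function name, parameters, and the exact way the margin of error is computed are assumptions and may differ from the procedure actually used.

```python
def impute_suppressed(total: int, threshold: int = 10, max_moe: float = 0.02):
    """Impute a suppressed small cell count, or return None if imputation
    would introduce too much error into a percentage with denominator
    `total`. All details here are illustrative assumptions.

    A count suppressed under "less than `threshold`" is known only to lie
    between 1 and threshold - 1; the imputed value is the midpoint of that
    range. The value is imputed only if the largest possible deviation from
    the true count, expressed as a share of `total`, is within `max_moe`.
    """
    low, high = 1, threshold - 1
    midpoint = (low + high) / 2          # e.g. 5 when counts < 10 are suppressed
    worst_case_error = high - midpoint   # largest possible deviation from truth
    if total > 0 and worst_case_error / total <= max_moe:
        return midpoint
    return None                          # margin of error too large: leave missing

print(impute_suppressed(total=500))  # 5.0  (4/500 = 0.8 pp, within 2 pp)
print(impute_suppressed(total=100))  # None (4/100 = 4 pp, exceeds 2 pp)
```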
Relevant Publications
Zarch, R., Dunton, S. T., & Childs, J. (2024, May). From data bonk to data wonk: The value of collaborative exploration of state-based data systems in support of equitable computer science education policy, programs and practices. In Proceedings of the 2024 on RESPECT Annual Conference (pp. 231-235).
Zarch, R., Dunton, S., Warner, J., Xavier, J., Childs, J., & Peterfreund, A. (2023, January). Common metrics: Lessons from building a collaborative process for the examination of state-level K–12 computer science education data. In ASEE Annual Conference and Exposition.
Warner, J. R., Fletcher, C. L., Martin, N. D., & Baker, S. N. (2021). Applying the CAPE framework to measure equity and inform policy in computer science education. Policy Futures in Education, 14782103221074467.
Zarch, R., & Dunton, S. (2022, May). Looking back to move forward: Measuring K-12 computer science education requires an equity-explicit perspective. In 2022 Conference on Research in Equitable and Sustained Participation in Engineering, Computing, and Technology (RESPECT) (pp. 100-104). IEEE.
Dunton, S., Zarch, R., Xavier, J., Warner, J., & Peterfreund, A. (2022). Determining metrics for broadening participation in computing: Connecting data to multi-state computer science education policy efforts. Policy Futures in Education, 14782103211064443.
Warner, J. R., Childs, J., Fletcher, C. L., Martin, N. D., & Kennedy, M. (2021, March). Quantifying disparities in computing education: Access, participation, and intersectionality. In Proceedings of the 52nd ACM Technical Symposium on Computer Science Education (pp. 619-625).
Fletcher, C. L., & Warner, J. R. (2021). CAPE: A framework for assessing equity throughout the computer science education ecosystem. Communications of the ACM, 64(2), 23-25.
Zarch, R., Xavier, J., & Peterfreund, A. (2019, February). Using state-based data systems to support broadening participation in computing. In 2019 Research on Equity and Sustained Participation in Engineering, Computing, and Technology (RESPECT) (pp. 1-1). IEEE.