The Berkeley Beacon releases first diversity audit in the organization’s 75-year history

Since fall 2020, when The Beacon saw more than a dozen staff resignations as a result of a toxic, racially insensitive newsroom culture, we have been working to reflect on and reexamine The Beacon’s organizational decisions, conduct, and news judgment. Among our initiatives was this content diversity audit.

Last semester, we conducted a content diversity audit of more than one thousand stories and photos published from August 2019 through August 2020, identifying racial disparities between our own pool of contributors and Emerson’s student body.

Although the audit took much longer to complete than anticipated, we believe it’s important to release its findings to the Emerson community for the sake of transparency and for comparison with future audits.

Below is a breakdown of how we completed the audit, a summary of our statistical findings, and the limitations we discovered while completing it.

How we completed the audit

The Beacon’s management team read through every story published between Aug. 28, 2019 and Aug. 31, 2020 and entered its data into a spreadsheet, with the exception of incident journal reports (which are submitted by the Emerson College Police Department, not Beacon staffers).

Beacon staffers performing the audit reviewed stories in their entirety, noting factors like the apparent race and ethnicity of each story’s writer, as well as that of the sources quoted in the story. They also sought to identify the gender of quoted sources. Additionally, the audit tracked how frequently people of color were pictured in Beacon photographs. Beyond this quantitative analysis, staffers also had the option to voice specific concerns about a story’s content, structure, or reporting.

Once all stories were reviewed, we analyzed the combined results to present this audit. We are grateful to the many Beacon staffers who helped us sift through hundreds of stories since we began this process.
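
As an illustration of that analysis step, below is a minimal sketch of the kind of tally such an audit spreadsheet lends itself to. The file name and column labels are hypothetical placeholders, not The Beacon’s actual audit sheet.

```python
# A minimal sketch of the tally step, assuming one spreadsheet row per audited
# story. The file name and column labels are hypothetical placeholders, not
# The Beacon's actual audit sheet.
import pandas as pd

audit = pd.read_csv("audit_2019_2020.csv")

# Percentage of stories by the writer's apparent race/ethnicity
writer_breakdown = audit["writer_race"].value_counts(normalize=True) * 100
print(writer_breakdown.round(1))

# Percentage of stories whose photos pictured at least one person of color
photo_share = audit["photo_features_poc"].mean() * 100  # boolean column assumed
print(round(photo_share, 1))
```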

Statistical Findings

Nearly 80 percent of the stories audited were written by Beacon staff members, as opposed to correspondents (freelance contributors not formally hired by The Beacon).

Of the 903 stories reviewed by Beacon staff members, 581, or 64.5 percent, were written exclusively by white students. Over this year-long period, writers of color wrote approximately 20 percent of Beacon stories, yet only 1.4 percent of stories (13 articles) were written by Black students.

By comparison, Emerson reported that 59 percent of its total undergraduate student population in the fall 2019 semester was white; 4 percent identified as Black or African American, 5 percent Asian, 13 percent Hispanic or Latino, 4 percent two or more races, 13 percent international students, 0.2 percent American Indian/Alaska Native, 0.02 percent Native Hawaiian/Pacific Islander, and 3 percent race or ethnicity unknown. 

The data derived from the audit shows that The Beacon had a disproportionate share of white writers in 2019-20: the share of stories by white contributors (64.5 percent) was 9.3 percent higher, in relative terms, than the white share of Emerson’s student population (59 percent). The data also showed a roughly 60 percent relative gap between the share of Beacon stories written by Black contributors (1.4 percent) and Black students’ share of the student population (3.6 percent).
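
To make the arithmetic explicit, these comparisons are relative differences rather than percentage-point gaps; the short calculation below, a sketch using only the figures reported above, reproduces them.

```python
# Relative differences between audit shares and Emerson's reported shares
beacon_white, emerson_white = 64.5, 59.0  # percent of stories vs. percent of students
beacon_black, emerson_black = 1.4, 3.6

print(round((beacon_white / emerson_white - 1) * 100, 1))  # 9.3  -> "9.3 percent more white"
print(round((1 - beacon_black / emerson_black) * 100, 1))  # 61.1 -> the roughly 60 percent gap
```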

It is worth noting that a large share of writers (13.8 percent) had a race/ethnicity that could not be confidently identified, which likely affected the accuracy of our findings. The college’s numbers also categorize international students as a separate demographic while the audit did not, which might similarly affect our findings.

As for gender diversity, the audit found that 51.0 percent of quoted sources were male-identifying, while 40.5 percent were female- or non-binary-identifying. The remaining 8.5 percent were listed as alternative sources.

According to the college factbook (which collects data based on legal sex), 61.3 percent of undergraduate students at the college in 2019 were women, and 38.7 percent were men. Thus, the share of male sources quoted by The Beacon was 31.8 percent higher, in relative terms, than the male share of the student body.
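
The same relative comparison, sketched below with the figures reported above, yields the 31.8 percent figure.

```python
# Male-identifying share of quoted sources vs. male share of undergraduates
male_sources, male_students = 51.0, 38.7
print(round((male_sources / male_students - 1) * 100, 1))  # 31.8
```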

The examination of Beacon photos, however, revealed that 43.3 percent of photos used in stories featured people of color, while 56.7 percent featured white individuals, more accurately reflecting the Emerson community.

We recognize there is no guaranteed way to answer every question in the audit definitively, which leads us to the limitations that are important to note when analyzing the audit’s findings.

Limitations 

This is the first content audit The Beacon has completed in the almost 75 years since its inception. Our findings have many limitations and areas for improvement, which we will take into consideration when planning future content audits so that they yield more conclusive patterns and findings.

One of the most glaring challenges we ran into was the complexity of identifying someone else’s race, ethnicity, or gender. Much of the audit asks questions about a writer’s or source’s personal identity. In many cases, it was easy to identify a source or contributor’s race or gender, especially if staffers were already familiar with the individual, but we acknowledge that this approach is inherently problematic.

In most cases, staffers were able to answer all questions on the audit with relative confidence. However, there is not always a definitive way to determine someone’s race or gender without being able to reach the person directly. These identities are often complex and personal, and cannot always be identified based solely on a photo in a story or a person’s social media profile.

For this reason, we included an option for auditors to mark an individual’s race/ethnicity as “unknown or unsure,” and asked staffers to err on the side of caution.

There are other limitations to the data as well. For example, we do not account for disabled students, nor does the college report students’ disability status in its official demographic data.

We recognize that the data reflected in the audit may not be 100 percent accurate in identifying each individual writer’s or source’s identity. Despite the lack of certainty, we feel this information is too valuable to discount.

Going forward, we will strive to find better ways to phrase these questions so that they yield more accurate results. We may also seek to analyze other factors, such as gender identity (which this first audit only examined for sources, not writers) and nationality.

Additionally, the length of time it took to complete the audit is a considerable area for improvement. While we were eager to analyze as much of our content as we could in order to produce a comprehensive report, reviewing every story published in the year-long period took months, even with the help of our staff. Future audits should consider limiting the number of stories audited for the sake of producing a report in a timely manner.

Conclusion

While the results of the audit are disappointing, this process was incredibly useful for gauging where The Beacon is today. The results demonstrate disparities across all aspects of our content creation. Further analysis will help us examine the magnitude of the institutional changes made since fall 2020, when the last audited stories were published.

In addition to serving as a point of comparison, this audit’s procedural shortcomings will help us formulate more informed questions and processes that yield more accurate results.

This first audit was a learning experience for The Beacon, and the disparities it brought to light will help inform our next audit and the changes we will continue to make in our newsroom. The next audit will cover our most recent content, from fall 2021 and the start of this spring, and will be released after spring break.