Creative Media Awards

Call for Proposals

Mozilla supports creative artists to examine the importance of responsible design in AI — and how the concept of responsible design applies across different geographies and sociocultural experiences.

Applications are closed. We are currently conducting a retrospective of the Creative Media Awards program and expect to launch the next Call for Proposals in early 2024.

About Mozilla Creative Media Awards

The Creative Media Awards (CMA) uplift artists and media-makers to investigate complex technical concepts, deepening our understanding of technology while demanding accountability from those who build and deploy it. The Creative Media Awards are part of a comprehensive set of activities supported by the NetGain Partnership, a philanthropic collaboration of funders working to advance the public interest in the digital age.

For the past four years, Mozilla’s Creative Media Award recipients have created interactive experiences, games, videos, and other media that explore artificial intelligence (AI) and its impact—both good and bad. These projects help people understand, imagine, and critique the potentially outsized role technology has in our lives, including what trustworthy AI could and should look like. CMA projects have:

● increased public awareness of the threats and opportunities of AI around the world

● explored, interrogated, and reimagined the role of data and how it may be better stewarded to empower people and communities

● explored current AI realities or imagined AI futures through the lens of Black experience.

At Mozilla, we care deeply about building an internet that is representative of all voices and the CMAs are one way we can realize that ambition.

2023 Creative Media Awards

As AI becomes more ubiquitous, and as awareness grows that its systems embed the values of those who design them, calls for “responsible”, “ethical” or “justice-centered” design have also increased. We agree: it is more important than ever to design responsible and trustworthy AI — that is, AI technologies that benefit all humanity (not just the handful who develop the technology) and mitigate possible harms up front and by design. But do we have a shared, or even a baseline, understanding of what “responsible” design means? How do geography, culture, and history shape our understanding and implementation of responsible or ethical design? When we invoke ethics or responsibility, whom do we presume is being centered as the recipient or executor of each?

The 2023 Creative Media Awards invites artists, digital storytellers, game designers, and creative technologists to examine the role, importance, and meaning of responsible, ethical or justice-centered design in trustworthy AI. Awardees are also invited to explore how the concept of responsible design is understood and implemented across different geographies and sociocultural experiences.

This year’s Creative Media Awardees will be convened as a cohort, or community of practice, whose projects collectively highlight the challenges, opportunities, and implications of designing trustworthy AI across different geographies, demographics, and cultures. Also this year, the Creative Media Awards will run in collaboration with Mozilla’s Africa Innovation Mradi program—a program focused on the development and governance of technologies and their impact on social justice issues, both online and offline, across the African continent. At least two awards will be allocated specifically to applicants from Eastern or Southern Africa.

What types of projects are we looking for?

● Projects could be interactive websites, short films, browser-based games, or other digital media that explore specific AI challenges and opportunities through the lens of responsible, ethical, or justice-centered design, as it pertains to their geographies, demographics, and cultures

● Projects that explore alternative visions of designing AI technologies while considering the social or political implications of ethical or responsible AI

● Projects that call into question how AI is designed in the corporate or government sectors and uncover values and practices that exacerbate inequalities

● Projects that increase public understanding and awareness of how AI technologies could be designed responsibly and ethically across different geographical contexts and situations.

Award Amounts

The awards will range from $15,000 up to $30,000 each. A budget is required as part of the full application. Final award amounts are at the discretion of award reviewers and Mozilla staff.

Other Benefits We Are Offering

Impact Design Workshops: Working in partnership with Mozilla and our consultant, each Awardee will create an impact plan—outlining a theory of change for each project, as well as accompanying performance indicators

Expert/Mentorship Support: All Awardees will have the opportunity to work with subject matter experts, who can be selected from within Mozilla’s network or recommended by the Awardee

Cohort and Individual Support: All Awardees will convene both virtually and in person for group working and feedback sessions, peer-to-peer support, and talks from guest speakers on relevant topics, and will meet 1:1 with Mozilla staff for additional support

Communication Support: Mozilla’s Communications and Marketing Team will evaluate and offer a spectrum of support to Awardees including developing tailored communication strategies for each project, pitching to local and international media outlets, crafting social media campaigns, and sharing with Mozilla’s website audience and email list

Showcasing Projects: Many past Creative Media Awardees have shown their work in a variety of regional and global film festivals (IDFA, Tribeca, TIFF, Cannes) and museums (Tate Modern, MoMA, HKW). While we can’t guarantee placement, we can support your submission efforts. In addition, Mozilla holds an annual festival called MozFest, which will provide a chance to showcase your work and to meet others in your cohort, as well as artists, activists, thinkers, and makers working in the space from around the world

Project Testing: Awardees will have the option to test their projects before fully launching.

Applicants/Projects must meet the following requirements

● Awards support in-progress works at either the conceptual or prototype stage

● Project must be screen-based media of an artistic or journalistic nature

● Project must be publicly accessible under a Creative Commons license and any project-related code under an open-source license

● Project should be suitable for a non-expert audience and demonstrate the potential to be broadly shared

● Creators must demonstrate an understanding of who their audience is and how they want their project to impact this audience

● Applicants must articulate how the work will help to illuminate an essential shift of power regarding the design and development of trustworthy AI technologies

● Applicants must demonstrate an ability to execute their plan via experience or a prototype

● Applicants who self-identify as any or all of the following are encouraged to apply:

  • people from the Global Majority or Global South;
  • Black, Indigenous, and other People of Color;
  • women, transgender, and/or gender-diverse applicants;
  • members of migrant and diaspora communities; and/or
  • people from climate-displaced or climate-impacted communities.

● These awards are open to all applicants regardless of geographical location or institutional affiliation, except where legally prohibited

● Grantees must begin working on their projects on or before the week of March 6, 2023.

We do not support:

● Apps exclusively available on proprietary ecosystems (e.g. iOS, Android)

● Productivity tools

● Software libraries

● Works that can only be experienced in person, such as gallery exhibits

Evaluation Criteria

Relevance to funding track: Does the project substantively do any of the following?

  • highlight specific challenges of designing responsible AI technologies
  • propose a new approach to addressing challenges to integrating trustworthy AI into technology practices
  • explore alternative visions to designing trustworthy AI technologies pertaining to their geographies, demographics and cultures
  • serve as a resource to under-represented groups to learn more about how AI technologies should be designed responsibly and ethically
  • explore trustworthy AI related issues in specific geographies, especially from the Global South or Global Majority

Artistic Merit: the project is equal parts creative and interactive, and will increase public understanding of how AI technologies should be developed responsibly

Public engagement potential: the project includes ideas on how to publicize the work to its target audience and/or a plan to grow that community.

Suitability of team: the individual or members of the project team seem suitable for the proposed project and/or plans are in place to pull in outside expertise where needed.

Impact: the project includes a clear understanding of outcomes and the social impact it will have.

Feasibility of budget and timeline: the applicant has a clear project plan in place and the award amount requested is appropriate and will be catalytic for the project. The project can be completed within the ten-month grant period.

Review and Selection

Awardees are selected based on quantitative scoring of their applications by a review committee, and a qualitative discussion at a review committee meeting. Committee members include Mozilla staff, alumni Mozilla Fellows/Awardees, and external experts. Selection criteria are designed to evaluate the merits of the proposed approach. Applicants must submit a full budget that supports the completion and distribution of the proposed project to its intended audience. Diversity in applicant background, past work, and medium is also considered. All registrations and submissions must be in English. If English is not the applicant’s first language, we encourage applicants to use free translation services. Applications will ultimately be judged on their conceptual strength, not the quality of their language.

Key Dates and Deadlines

● October 31, 2022: Application opens

● November 08, 2022: Informational Session about the call

● December 9, 2022: Letter of Intent (LOI) Due at 12pm ET/4pm GMT (LOIs are reviewed on a rolling basis and moved into “Full Application” stage if eligible)

● December 27, 2022: Full Application Deadline at 12pm ET/4pm GMT

● January 30, 2023: Awardee notification of decisions

● Week of March 6, 2023: Public awards announcement + grantees begin project work

● March – July 2023: Capacity building workshops and a cohort onsite gathering

● November – December 2023: Public project launches

● January 2024: Final report deadline

If interested in the virtual information session on November 08, 2022 via Zoom, you can sign up here. A recording of the session will be made available on this page and sent to all registered participants the following day.

Mozilla exists to protect and promote the internet as a global public resource, open and accessible to all. As a critical part of this mission, Mozilla invests in the innovators at the frontlines of making the internet more open, inclusive, decentralized, and secure. Through fellowships and awards, we support these leaders and amplify their important work impacting the health of the internet.

Previously Funded Creative Media Awards Projects

Mozilla has supported a variety of Creative Media Awards projects—short videos, browser extensions, games, and visualizations. These are examples of the types of projects and makers we wish to support:

Future Wake - an interactive artwork that uses AI trained on real law enforcement data to predict future police killings.

Stealing Ur Feelings - Mozilla Awardee Noah Levenson explored emotional sentiment captured through social media filters in this interactive documentary.

Survival of the Best Fit - Mozilla Awardee team from NYU Abu Dhabi created a game that explored hiring bias in automated decision making.

Binary Calculations Are Inadequate - Mozilla Awardee Stephanie Dinkins created an app and workshop series that asks how data-driven algorithms could be more caring.

Dark Matters - Mozilla Awardee Johann Diedrick created an experience that lets you take the seat of a machine learning researcher encountering speech data used to train AI systems similar to those behind Alexa, Google Home, and Siri.

For any questions regarding this call for applications, please contact [email protected]