This is a Q&A with Amarela, a Mozilla Fellow in the Tech and Society Fellowship program.

Amarela is a researcher and communicator who runs digital care training programs, with a focus on educating social justice activists about online security and privacy. Before becoming a Mozilla Fellow, he worked on holistic security and researched the implications of Brazil’s expanding digital surveillance programs. Amarela co-organizes the Criptofunk festival in Rio de Janeiro, an annual event that cultivates diverse conversations about technology’s effects on the body with the motto “encrypt data, decrypt the body.”

We connected with Amarela to discuss his personal connections to digital care, his most significant challenges and successes, and how he is fighting against damaging stereotypes surrounding women working in technology.

1. Part of your Mozilla fellowship work involves teaching social activists about data collection and surveillance. What’s some of your most important advice in these areas?

I believe my advice is more political than technological.

People tend to see digital security as a series of restrictions: actions they need to take, or avoid, for a certain period of time, usually after an attack or a threat. They treat it as if it were a diet. What I try to do is present another perspective, closer to food reeducation than a diet: a change in culture, a more lasting process.

Digital technologies are part of our daily lives and what happens to our data also impacts our body. Online and offline are inseparable. Taking care of our online data should be a self-care strategy. Doing physical exercise, eating well, hand-washing, backing up files, managing passwords, and keeping anti-virus software up to date are actions that need to be part of our daily lives.

Another issue is politicizing choices. For instance, choosing an email provider isn’t just a matter of practicality. The provider’s business model and its privacy and security policies must also come into play, especially for human rights organizations. If you fight for rights, you don’t want to put your data in the hands of companies that are not guided by human rights principles, right?

2. What is your personal connection to your research area? Why is it important to you to do this specific work at this moment?

I grew up on the Internet. It was through it that I made cherished friends, that I was able to exercise my subjectivity, that I could experiment with taking on different personas. It was also where I got to know (cyber)feminism, activist groups like Indymedia, and the free software movement. When I started to feel that the Internet was no longer such safe territory for me as an activist and LGBT person, I started trying to understand how it worked, both politically and technically. At that moment I started working with digital security.

Today, even though people use the Internet on a daily basis, most don’t know how it works and don’t know the political arrangements that support it. They don’t know the political choices that, by shaping the Internet, also shape our lives. My work today consists of facilitating conversations about these issues with activist groups, human rights organizations, and social movements. I’m attempting to enable a transformation in the way these groups relate to technologies, and facilitate discussions about how technologies should affect society.

3. How would you describe digital care training to someone who is unfamiliar with it?

I call “digital care” a way of approaching digital security from the perspective of everyday care. Digital technologies are part of our daily lives, and online and offline events affect our lives equally, so taking care of our data needs to be part of our care routine and habits.

My digital care work is based on this perspective and generally includes content on how the Internet works (its political and technical aspects), how to create and take care of passwords, how to use messaging applications more securely, privacy and security on social networks, security of devices, encryption of files, disks, and messages, risk analysis, anonymity, identity management, etc.

Nowadays, mental health, time management, the relationship between work and the Internet, the need to create agreements regarding availability, and participation in message groups are also part of the discussions.

4. What do you think is the impact of your work in both your own community, and further afield?

Perhaps the greatest impact is the popularization of the topic of digital care among organizations that fight for social justice. I’ve always worked to reveal the power relations that exist in the construction and use of technologies, and to explain the importance of critical thinking and autonomy in the use of technology. I hope I’ve helped organizations to participate in the broader debate about how technologies should affect society.

I believe my work has also been useful for strengthening the digital care community in Brazil, through the organization of events such as Criptofunk and the publication of my research "Digital Care and Philanthropy: Findings and Basic Recommendations."

5. What are some of your most significant challenges in your work and research?

One challenge has been to help organizations internalize and maintain the digital care debate and practices institutionally. Although most human rights organizations in Brazil understand the importance of such practices, they are unable to prioritize the agenda and implement institutional policies.

Given this context, I’ve been trying to research the digital care ecosystem in Brazil to find ways to bring together sectors that can create joint strategies to address the problem, such as philanthropic organizations, digital care professionals, groups that develop digital infrastructure services, and the human rights organizations themselves.

6. Who is doing work in your area that you admire, and why? Are there any specific success stories, case studies, campaigns, or models that provide inspiration for your work?

When I started working with digital care, Coding Rights and Derechos Digitales were organizations that inspired me a lot for the way in which they managed to communicate issues about digital rights in a light, fun and captivating way, at a time when the subject seemed arid and far from people's interest. Works such as Chupa Dados and the campaign on anonymity are good examples. These two organizations continue to produce works that are references for the field.

Another group I admire is the Transfeminist Digital Care Network in Brazil. The way the Network is organized is very inspiring: the work is horizontal and collective, and based on care and affection. In addition, the Network is made up of an increasingly diverse group of people, who together have been producing materials such as Prato do Dia, which offers tips on digital care, drawing a parallel between eating habits and digital habits, and between food sovereignty and technological sovereignty.

In terms of pedagogy, it’s worth mentioning Escola de Ativismo in Brazil, and the Detroit Community Technology Project in the United States. Both groups work with education and technology based on popular education and community learning practices.

A project by artist Zach Blas that I recently discovered, Queer Technologies (QT), has been a huge source of inspiration. QT is an artistic project developed between 2008 and 2012 that “by re-imaging a technology designed for queer use, critiques the heteronormative, capitalist, militarized underpinnings of technological architectures, design, and functionality.”

7. You’ve said “There is a parallel between the feminist phrase ‘my body, my rights’ and ‘my data, my rights’ that is worth acknowledging. The offline and online worlds are not separate. When you are attacked online, you are attacked in real life.” With how integral digital spaces and experiences are to many people’s lives, why do you think this perception that it’s somehow more acceptable to attack someone online still persists? What work are you doing to try to change this perception?

I’m not sure if there's a perception that it's more acceptable. What I see is a perception that it’s easier to perpetrate online violence than physical violence. People think they won't be found out and punished. But there are several cases in Brazil that show the opposite. In addition, many victims of online violence (particularly political violence) are looking for ways to identify and hold their perpetrators accountable. An accountability movement is very important and has the potential to stop the wave of attacks, educating the population and discouraging possible aggressors.

But my work has more to do with victims and potential victims. I work to help people protect themselves, to use tools, techniques, and strategies that enable a safer and healthier use of technologies.

8. You’ve spoken about the damaging misconception that women are less skilled in technology and data. What work have you been doing to try to correct this mentality?

I don't believe it's a misconception. It’s an intentional, sexist narrative meant to separate women from technologies: from their use, creation, and governance, since these spaces have become places of power. Unveiling this narrative and providing spaces for debate about the political nature of technologies is what I’ve been trying to do.

I believe the feminist character of my work is potent, but it happens in a subtle way. It’s present in the way I construct the learning methodologies, the training script, and in the language I use.

The field of digital security was built on militarized narratives and methodologies, historically developed, even within the activist field, by white men from the Global North. Such narratives and practices feed (or produce), as a consequence (or a strategy), fear: the feeling that, ironically, is the main characteristic of the surveillance industry and authoritarian regimes.

In my work I try to welcome fear, not feed it. I try to work with affection as the guiding principle of learning, betting on it as a powerful way to structure exchanges and enable transformation.

9. How has this Mozilla fellowship changed or affected your research and work? Can you describe how you’ve collaborated with your host organization, FASE?

With the fellowship, I was able to develop an organizational digital security process with FASE. FASE is a 60-year-old organization with more than 90 employees, six offices in different regions of the country, and four different causes: the right to the city, environmental justice, women's rights, and food sovereignty. As it’s such a large and diverse organization, it was a very challenging job, especially amid the pandemic’s social isolation. The process required intense work to build trust with the team, so I tried to use methodologies and a language that made sense for the organization, with images and symbols the teams shared.

Being able to dedicate myself for so long (two years) to a single organization made me realize the various challenges an organization goes through in the process of internalizing digital care practices. This awakened in me the desire to research the ecosystem of digital care in Brazil and produce/publish information that could help the field to develop.

The fellowship allowed me not only to work with an organization in a very specific context, but also to look at the bigger picture and to help develop the digital care ecosystem in Brazil.

10. You also work on interesting, collaborative events and projects like Criptofunk and your zine “To Be a Monster.” Can you describe the work you do for these outlets and why it’s important to you?

Criptofunk is a festival about technology and society organized by a collective of independent activists and Brazilian organizations including data_labe, Olabi, Observatório de Favelas, and Intervozes. The event is held in Favela da Maré, in Rio de Janeiro, and has the motto “encrypt data, decrypt the body.” It’s an event that, in addition to placing the body at the center of the debate on technologies, also proposes thinking about technologies from a favela territory. As a result, Criptofunk hosts sessions on funk, ancestral technologies, freedom of expression, privacy, surveillance, well-being and “Buen Vivir”, favela technologies, and cryptography.

Criptofunk includes a range of perspectives and experiences, which makes for diverse sessions and audiences. It’s a very important event for me, and I believe for all people who work with digital care, because it’s a space that promotes the exchange of experiences and a certain renewal of perspective. It also sustains and renews community bonds, and attracts new people. I also love Criptofunk because it points to creative, ancient, and pleasant ways out of the dilemmas we face in our daily relationships with digital technologies.

To Be a Monster is a fanzine about the relationship between privacy and visibility in the context of LGBTQIA+ experiences. It explores the concept of “identity management” (present in the universe of digital security) as a defense tactic, but also as a way of exercising our subjectivity and experiencing otherness: being another person. It’s an exercise that’s contained in our LGBT experience, with our transformations and trans-identities, and that can be enhanced in relation to digital care.

To Be a Monster is especially important to me because it treats digital care as both a practice that can be playful and a tool that can expand life possibilities. That’s very different from how it’s usually seen, as something that restricts or prevents the flow of everyday activities.