The Dark Side of AI, AI Patents & AI Index Report -- 🦛 💌 Hippogram #7

The 7th edition of the Hippogram focuses on AI patents and open access to knowledge.


Welcome to our newsletter for health and tech professionals - the weekly Hippogram.

I'm Bart de Witte, and I've been part of the health technology community and industry for more than 20 years as a social entrepreneur. In that time, the evolution of technology has changed the face of healthcare, business models and culture in many inspiring but unexpected ways.

This newsletter exists to share knowledge and insights. Sharing is the heart of the Hippo AI Foundation, named after Hippocrates. Know-how will increasingly be derived from our data, so it is crucial that it is shared within our digital health systems. We believe that building a more equitable and sustainable global digital health ecosystem will benefit everyone.

I'm thrilled that the Hippogram is being recommended by our readers and that we now have readers in 12 different countries. Want to read the whole newsletter? Sign up here for the entire Hippo experience.

Patents, Patents everywhere but not a single drop to drink

The recently published 2022 AI Index Report by the Stanford Institute for Human-Centered Artificial Intelligence showed that the number of AI patent applications filed in 2021 was more than 30 times higher than in 2015, a compound annual growth rate of 76.9%.

Source: Stanford HAI AI Index Report 2022 - Chapter Research and Development
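As a quick plausibility check of my own (not taken from the report), a 76.9% compound annual growth rate sustained from 2015 to 2021 does indeed multiply the filing count roughly 30-fold:

```python
# Back-of-the-envelope check: does a 76.9% CAGR over 2015-2021
# really imply a ~30x increase in AI patent filings?

cagr = 0.769          # compound annual growth rate cited in the AI Index Report
years = 2021 - 2015   # six compounding periods

growth_factor = (1 + cagr) ** years
print(f"Growth factor over {years} years: {growth_factor:.1f}x")
# -> roughly 30.6x, consistent with "more than 30 times higher"
```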

This exponential increase pushes the demand for more transparency further out of reach than ever before. Why? Keeping AI models hidden lowers the risk of patent infringement claims - a dynamic dramatized in the Netflix mini-series The Billion Dollar Code, in which Joachim Sauter, a media artist who helped develop "Terra Vision" in the early 1990s, takes Google to court. This rapid increase in AI patent filings will cause more difficulties than it solves. Let me explain:

Power Asymmetries and the dark age of AI

In 2020, the Berlin-based NGO AlgorithmWatch launched a project to monitor Meta's Instagram newsfeed algorithm. Only by understanding how society is affected by platforms' algorithmic decisions can we take action to ensure they don't undermine our autonomy, freedom, and the common good. Volunteers could install a browser add-on that scraped their newsfeeds and sent the data to a database, which the research team used to study how Instagram prioritizes pictures and videos in a user's timeline - in other words, which images and videos Meta's algorithms favour. In spring 2021, Meta threatened AlgorithmWatch with legal action if it continued the data donation project. AlgorithmWatch reported that Meta had accused it of violating the platform's terms of use, which prohibit the automated collection of data. Faced with Facebook's threat to take "more formal action," the NGO terminated the project.
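AlgorithmWatch has not published its pipeline here, so the following is a purely hypothetical sketch (my own illustrative data and field names, not their schema or findings) of the kind of analysis such a data donation project enables: given donated feed snapshots recording each post's position and media type, it checks which kinds of content tend to be placed higher in the timeline.

```python
from collections import defaultdict

# Hypothetical donated feed snapshots: each record notes where a post
# appeared in a volunteer's timeline and what kind of media it contained.
# (Illustrative data only - not AlgorithmWatch's actual schema or results.)
donated_posts = [
    {"feed_position": 1, "media_type": "video"},
    {"feed_position": 2, "media_type": "image"},
    {"feed_position": 3, "media_type": "video"},
    {"feed_position": 4, "media_type": "image"},
    {"feed_position": 5, "media_type": "image"},
    {"feed_position": 6, "media_type": "carousel"},
]

def average_position_by_media_type(posts):
    """Lower average position = shown earlier = favoured by the ranking."""
    positions = defaultdict(list)
    for post in posts:
        positions[post["media_type"]].append(post["feed_position"])
    return {media: sum(p) / len(p) for media, p in positions.items()}

if __name__ == "__main__":
    averages = average_position_by_media_type(donated_posts)
    for media, avg in sorted(averages.items(), key=lambda item: item[1]):
        print(f"{media:<10} average feed position: {avg:.1f}")
```

The point of the sketch is simply that researchers can only ask such questions when the feed data is observable at all - which is exactly what Meta's legal threat shut down.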