David v. Goliath: Small language models challenge Big Tech’s AI giants

Community-owned small language model project introduces a framework for incentivized data sharing, aiming to redefine AI development and user interaction.

Assisterr targets the monopolization of AI by Big Tech, advocating for data ownership and the democratization of AI through community-owned small language models (SLMs), which offer tailored and efficient solutions.

The debate on whether artificial intelligence (AI) poses a global threat often misses a crucial point: the real danger lies not in AI itself but in its potential monopolization by the tech giants — commonly known as Big Tech — and governmental bodies. These powerful entities can misuse AI to subtly shape public perceptions and behaviors to serve their own ends, whether for profit maximization or political control.

Far from being a dystopian fantasy, this scenario reflects our current reality, demanding immediate intervention. Data ownership lies at the heart of the issues with AI technology. Big Tech has effectively appropriated the collective knowledge of humanity, training their large language models (LLMs) on free information and then locking it behind $20 monthly subscriptions.

Google’s $60 million annual investment for access to Reddit’s treasure trove of user-generated content underscores the disparity between the value created by community contributions and the compensation (or lack thereof) received by those contributors.

Empowering communities with small language models
Against this backdrop, Assisterr — a Cambridge-based data layer for decentralized AI — positions itself as a force for change, building infrastructure that supports decentralized AI data inference and a network of community-owned SLMs, empowering the very people who feed the data ecosystem.

Assisterr provides a data infrastructure layer for small language models. Source: Assisterr
