Why Decentralized Computing Can Lower the Cost of AI
The rapid advancement of artificial intelligence has produced tools that make businesses and organizations markedly more efficient. One standout example is ChatGPT, which can transform how an organization works.
However, despite its promise, many organizations cannot adopt AI because of the cost. Training a GNN on even a (small) million-edge graph requires at least four expensive machines in recent systems such as NeuGraph and Roc. A public cloud offers flexible pricing, but cloud GPU instances still carry a non-trivial cost: the lowest-configured p3 instance type on AWS is priced at $3.06/h, and training realistic models requires hundreds of such machines running 24/7, or roughly $300,000/month. While cost is not a concern for big tech firms, it places a heavy financial burden on small businesses and organizations.
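The monthly figure follows from simple arithmetic. A minimal sketch, assuming the $3.06/h p3 rate quoted above and a hypothetical fleet of 135 instances (the exact fleet size is an assumption, not a figure from the original):

```python
# Back-of-the-envelope cloud training cost.
HOURLY_RATE = 3.06         # USD per p3 instance-hour (AWS on-demand)
NUM_INSTANCES = 135        # assumed fleet size ("hundreds of machines")
HOURS_PER_MONTH = 24 * 30  # running 24/7

monthly_cost = HOURLY_RATE * NUM_INSTANCES * HOURS_PER_MONTH
print(f"${monthly_cost:,.0f}/month")  # → $297,432/month
```

At this scale, even a modest per-hour rate compounds to roughly $300,000 per month.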
Fortunately, decentralized computing provides a solution to these issues.
Cheaper computing power
In a traditional data center, costs are broken down into servers (30%), housing (12%), networking (15%), AC (21%), power (17%), and people (5%).
Decentralized computing relies on users sharing their resources and contributing computing power in a mutually beneficial way, removing the need for dedicated housing, networking, AC, power, and staff. In theory, that eliminates everything except the servers themselves, for up to 70% savings over a centralized data center.
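The 70% figure falls directly out of the breakdown above. A quick check, summing every cost category except the servers:

```python
# Data-center cost breakdown, as fractions of total spend.
cost_breakdown = {
    "servers": 0.30,
    "housing": 0.12,
    "networking": 0.15,
    "AC": 0.21,
    "power": 0.17,
    "people": 0.05,
}

# Everything except the servers is overhead a decentralized
# network can, in theory, avoid paying for.
savings = sum(v for k, v in cost_breakdown.items() if k != "servers")
print(f"Theoretical savings: {savings:.0%}")  # → Theoretical savings: 70%
```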
There are two major obstacles to AI training:
it relies on high-end servers with many GPUs, which are expensive to purchase and maintain, and
limited GPU memory cannot scale to today's large models.
Decentralized computing can make AI training more scalable and affordable. The key is computation separation, which allows the construction of a deep, bounded-asynchronous pipeline in which parallel tasks overlap. With thousands of parallel serverless threads, decentralized computing can scale GNN training to billion-edge graphs.
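The idea of a bounded-asynchronous pipeline can be sketched in a few lines: stages run in parallel threads and overlap, while a bounded queue caps how far ahead one stage may run. This is an illustrative sketch only; the stage names (`load_partition`, `train_step`) are hypothetical, not an actual training API.

```python
import queue
import threading

BOUND = 2  # staleness bound: at most 2 partitions in flight at once

def load_partition(i):
    return f"partition-{i}"  # stand-in for fetching a graph partition

def train_step(data):
    return f"trained on {data}"  # stand-in for a GNN training step

def producer(q, n):
    for i in range(n):
        q.put(load_partition(i))  # blocks once BOUND items are queued
    q.put(None)  # sentinel: no more work

def consumer(q, results):
    while (data := q.get()) is not None:
        results.append(train_step(data))

q = queue.Queue(maxsize=BOUND)  # the bound enforces bounded asynchrony
results = []
t1 = threading.Thread(target=producer, args=(q, 4))
t2 = threading.Thread(target=consumer, args=(q, results))
t1.start(); t2.start(); t1.join(); t2.join()
print(results)  # four trained partitions; loading overlapped with training
```

Because the queue is bounded, loading never runs arbitrarily far ahead of training, which keeps memory use predictable while still letting the two stages overlap.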
According to recent research from UCLA, decentralized computing can deliver up to 2.75x more performance-per-dollar than traditional GPU servers on large graphs; for massive sparse graphs, it can be 1.22x faster and 4.83x cheaper than GPU servers.
Traditional AI solutions require significant investment in software development, infrastructure, and talent. However, decentralized computing networks can allow developers to leverage existing resources and infrastructure, making it easier and more cost-effective to build and deploy AI applications.
Decentralized computing also has the potential to democratize AI development, making it more accessible to a wider range of organizations and individuals. By leveraging a network of decentralized computers, organizations can share computing resources and collaborate on the development of AI solutions. This can reduce the cost of development and accelerate innovation, as more people can contribute to the development of AI.
Affordable and scalable infrastructure for AI
As the field of AI continues to evolve, decentralized computing will play an increasingly important role in its development and widespread adoption. By reducing the cost of training and computing power, decentralized computing has the potential to make AI accessible to a wider range of organizations and individuals, and to drive growth and innovation in numerous industries.
Exabits aims to create a community-driven decentralized computing platform. For small organizations and individuals with tight cost constraints, Exabits provides an affordable solution by tapping decentralized computing power at an extremely low price. For those who need to train large models, e.g., GNNs on very large graphs, Exabits provides a scalable solution that supports fast and accurate AI training on billion-parameter models. Head on over to Exabits and find people who are betting on AI and are already working to make a bright, AI-driven future a reality.