Bittensor SN1: Apex - The Future of Decentralized AI



The Incentivized Internet

Subnet 1 is the most intelligent inference model on Bittensor. As the first agent to achieve deep-researcher reasoning on the protocol, it has long been the network's pioneer of decentralized intelligence. This repository is the official codebase for Bittensor Subnet 1 (SN1) v1.0.0+, which was released on 22nd January 2024.

Discord | Network | Research


Our Product Suite: Apex & Beyond

Subnet 1 has several products:

Apex Chat

This is our chat interface which provides a range of high-quality models, allowing users to select the model most appropriate for their own use case. Apex offers five distinct variations:

  • Basic - our fastest model, perfect for everyday conversations
  • Combined - draws on multiple sources for well-rounded responses
  • Web-enhanced - grounds responses in current online results
  • Deep Researcher - for in-depth responses broken into individual steps
  • Reasoning - for higher-quality answers backed by more advanced reasoning

Try the Apex Chat app here: app.macrocosmos.ai/apex

For a more detailed read, check our docs here: docs.macrocosmos.ai/subnets/subnet-1-apex

You can also access Subnet 1, Apex via the API. Find out more here: docs.macrocosmos.ai/developers/api-documentation/sn1-apex
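
For orientation, here is a minimal sketch of what a client call might look like in Python. The base URL, endpoint path, headers, and payload fields below are illustrative placeholders, not the documented schema; the API documentation linked above defines the real interface.

```python
# Hypothetical sketch only: endpoint, auth header, and payload fields are
# placeholders, not the documented SN1 API schema.
import requests

API_KEY = "your-api-key"                       # obtained as described in the API docs
BASE_URL = "https://example.invalid/sn1-apex"  # placeholder, not the real endpoint


def ask_apex(prompt: str) -> dict:
    """Send a single chat prompt to a hypothetical Apex inference endpoint."""
    response = requests.post(
        f"{BASE_URL}/chat",
        headers={"Authorization": f"Bearer {API_KEY}"},
        json={"messages": [{"role": "user", "content": prompt}]},
        timeout=60,
    )
    response.raise_for_status()
    return response.json()  # the real response shape is defined in the API docs


if __name__ == "__main__":
    print(ask_apex("Summarise recent work on decentralized inference."))
```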

Mission Commander

This is an agentic LLM chatbot built into Gravity, designed to help you pick the right terms and phrases for your data-scraping needs. Simply tell it what information you want and it will offer suggestions and help with brainstorming. Mission Commander is built on Subnet 1, Apex, which is also owned by Macrocosmos, and it lowers the barrier to entry for data collection even further.

Try Mission Commander via Gravity here: app.macrocosmos.ai/gravity

For a more detailed read, check our docs here: docs.macrocosmos.ai/constellation-user-guides/gravity

MCP (Model Context Protocol)

You can integrate Subnet 1, Apex, directly into Claude and Cursor via our MCP. This lets you access our web-search options and inference from those tools, rather than only from our website. Setup provides you with a URL for Apex, an API key, and a guide on how to use the model.

Try the MCP by following our guide here: docs.macrocosmos.ai/developers/tools/macrocosmos-mcp

Productizing other subnets with Apex

Apex also powers inference across Bittensor. It is integrated with Squad.ai, Rayon Labs' agent platform built on SN64, Chutes, which means that Apex provides the reasoning abilities for agents built on Squad, boosting their capabilities and adding value to Rayon Labs through its cost-effective efficiency. In other words, Apex brings more agentic opportunities to Bittensor.

Market Opportunity & Traction

Indicators that this subnet's value will rise

Subnet 1 provides access to decentralized, competitive intelligence. As its speed of inference continues to rise, and miner efficiencies compound, Apex is able to provide fast, flexible, and efficient inference at scale. With the market for AI applications only growing, the ability to provide fast, efficient inference while keeping cost per token low will become increasingly desirable, putting Apex in a strong position to benefit - and making it a strong investment proposition.

By combining real‑time web search with LLM‑grade synthesis in a single, developer‑friendly endpoint, the service sits squarely at the intersection of two explosive trends — AI‑native application stacks and the API‑first economy. Demand spans customer-facing chatbots, autonomous agents, research copilots, and data‑driven decision dashboards, giving the platform both horizontal reach and strong pricing power. Early adopters report dramatic boosts in user engagement and analyst productivity, and investors view the recurring‑revenue potential as “SaaS‑plus,” since the engine continuously improves with every query. In short, the product is meeting a surging, underserved need at precisely the right moment, positioning it for rapid, sustainable growth.

Long-term value potential

Thanks to its distributed architecture, Apex has the potential to provide inference at a scale, speed, and efficiency that outmatch centralized competitors.

Moreover, many of the most valuable use cases for AI - including law, finance, and health - are so sensitive to data privacy that they cannot use centralized models. This creates a huge opportunity for decentralized alternatives that deliver intelligence without amassing data in one place. As the most intelligent model on Bittensor, itself the foremost DeAI protocol on-chain, Apex is very well positioned to take advantage of this opportunity. For those who believe in DeAI, an investment in Subnet 1 represents a strong prospect.

Interest from customers and users

Subnet 1, Apex, has already built up strong support and positive user feedback, and it consistently ranks in the top 10 subnets, displaying deep and broad market support. Apex and Subnet 13, another Macrocosmos subnet, were the only subnets available on Squad.ai, Rayon Labs' agent platform built on Subnet 64, Chutes (itself the top-performing subnet on Bittensor) - showing Apex's reliability, value, and desirability across Bittensor.

Key Use Cases & Potential

Subnet 1, Apex, has a range of use-cases:

  • Agentic chat interfaces, designed to plug into other subnets or commercial tools.
  • Fast, low-cost, web-search-enhanced LLM inference.
  • Deep research and reasoning models that can tackle complex mathematical and logic-based problems.

Apex has the potential to become the flagship decentralized LLM experience across the tech world. By utilizing Bittensor's architecture, we offer speedy and low-cost inference that could soon rival SOTA models in the industry.

The Team Behind Subnet 1

Subnet 1 was built by Dr. Steffen Cruz, AKA @Macrocrux, when he was CTO of Bittensor. Steffen has led Apex through multiple iterations, overseeing its evolution into Bittensor's premier provider of decentralized intelligence.

Apex's engineering team is one of the most impressive on Bittensor. It includes Felix Quinque, who led the Chain of Thought, Reasoning, and Logits upgrades; Dmytro Bobrenko, who led Organic Scoring and DeepResearcher; Rich Wardle, who drives research and development; and Kalei Brady, who led the GAN-based architecture upgrade and leads SN1's Discord community. The team also receives support from other Macrocosmos engineers, making Subnet 1 one of the best-staffed projects on the protocol - all of which helps ensure its long-term viability.

Why this team is right for the project

Steffen Cruz designed Subnet 1 while CTO of Bittensor. His continued stewardship preserves knowledge accrued from years of experience building on Bittensor and a deep understanding of how best to convene state-of-the-art intelligence on chain.

Will Squires, CEO and co-founder of Macrocosmos, first began working with Jacob Steeves (@Const) during Bittensor's Revolution upgrade in October 2023, meaning that Will contributed to the release that enabled multiple subnets. Will's strong commercial background leading large multinational companies, combined with his experience creating and leading AI start-up teams, ensures that Subnet 1, Apex, has leadership capable of both delivering innovation and establishing commercially viable partnerships.

Technical Foundation (For Developers)

This repo defines an incentive mechanism to create a distributed conversational AI for Subnet 1 (SN1). Validators and miners are based on large language models (LLMs). The validation process uses internet-scale datasets and goal-driven behaviour to drive human-like conversations. To learn more about the Bittensor project and the underlying mechanics, read here.

Agentic Tasks

Subnet 1 uses the concept of "Tasks" to control the behavior of miners. Validators create a variety of tasks, each of which includes a "challenge" for the miner to solve, send them to 100 miners, and score all the completions they receive back. A simplified sketch of this flow is shown below.
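
The sketch below illustrates this fan-out-and-score loop in plain Python. It is a conceptual outline only: the task factory, miner query, and scoring functions are hypothetical stand-ins, not code from this repository.

```python
# Conceptual sketch of the validator flow: build a task, fan it out to a sample
# of miners, score every completion. All helpers here are illustrative stubs.
import random

NUM_MINERS_PER_TASK = 100  # each task is sent to roughly this many miners


def make_task() -> dict:
    """Stand-in task factory; a real validator builds challenges from datasets."""
    return {"challenge": "Explain why the sky is blue.",
            "reference": "Rayleigh scattering of sunlight in the atmosphere."}


def query_miner(uid: int, challenge: str) -> str:
    """Placeholder for the network call that asks miner `uid` to answer the challenge."""
    return f"Completion from miner {uid}: {challenge}"


def score_completion(task: dict, completion: str) -> float:
    """Placeholder scorer; real tasks use similarity metrics against a reference."""
    return float("scattering" in completion.lower())


def validation_round(miner_uids: list[int]) -> dict[int, float]:
    """One round: create a task, send it to sampled miners, score all replies."""
    task = make_task()
    sampled = random.sample(miner_uids, min(NUM_MINERS_PER_TASK, len(miner_uids)))
    completions = {uid: query_miner(uid, task["challenge"]) for uid in sampled}
    return {uid: score_completion(task, c) for uid, c in completions.items()}


if __name__ == "__main__":
    print(validation_round(list(range(8))))
```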

Task Descriptions

  1. Inference: A question is given with some pre-seeded information and a random seed. The miner must perform an inference based on this information to provide the correct answer. Completions are scored with similarity metrics (see the scoring sketch after this list).

  2. Multistep Reasoning (MSRv2): This task operates in two stages: generative and discriminative. In the generative stage, a single miner receives a challenge and generates a response. In the discriminative stage, this generated response (or sometimes a validator-provided "real" answer) is presented to a set of discriminator miners, who must each output a score (0-1) assessing the answer. Rewards are then calculated: discriminators are rewarded based on how accurately their score reflects the ground truth (i.e., whether the answer was miner-generated or real), while the original generator miner is rewarded based on the collective assessment of the discriminators. If a "real" answer was used, that portion of the reward is distributed among other non-discriminating miners. A simplified reward calculation is sketched after this list.

  3. Web Retrieval: The miner is given a question based on a random web page and must return scraped website content that contains the answer. This requires searching the web to locate the most accurate and reliable source. The miner is scored on the embedding similarity between the answer it returns and the original website from which the validator generated the reference.
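
To make the scoring above concrete, here is a minimal sketch of embedding-similarity scoring in the spirit of the Inference and Web Retrieval tasks. The toy character-count embedding is an assumption purely for illustration; the actual validators use real text encoders and their own metrics.

```python
# Illustrative similarity-based scoring (Inference / Web Retrieval style).
# The toy embedding below is a stand-in for a real sentence encoder.
import numpy as np


def embed(text: str) -> np.ndarray:
    """Toy bag-of-characters embedding; a placeholder for a proper text encoder."""
    vec = np.zeros(256)
    for ch in text.lower():
        vec[ord(ch) % 256] += 1.0
    return vec


def similarity_reward(reference: str, completion: str) -> float:
    """Reward in [0, 1]: cosine similarity between reference and completion embeddings."""
    a, b = embed(reference), embed(completion)
    denom = float(np.linalg.norm(a) * np.linalg.norm(b)) or 1.0
    return max(0.0, float(a @ b) / denom)
```

The MSRv2 reward split could be sketched as below: discriminators earn more the closer their 0-1 score is to the ground truth, and the generator earns from the discriminators' collective assessment. The exact formulas here are assumptions for illustration, not the subnet's implementation.

```python
# Rough sketch of MSRv2-style rewards; the formulas are illustrative assumptions.
def msr_rewards(scores: dict[int, float], answer_is_real: bool) -> tuple[dict[int, float], float]:
    """`scores` maps discriminator uid -> probability (0-1) that the answer is miner-generated."""
    ground_truth = 0.0 if answer_is_real else 1.0
    # Discriminators: the closer their score is to the ground truth, the higher the reward.
    discriminator_rewards = {uid: 1.0 - abs(s - ground_truth) for uid, s in scores.items()}
    # Generator: rewarded when discriminators collectively judge its answer to be "real".
    mean_score = sum(scores.values()) / len(scores)
    generator_reward = 0.0 if answer_is_real else 1.0 - mean_score
    return discriminator_rewards, generator_reward


if __name__ == "__main__":
    print(msr_rewards({1: 0.2, 2: 0.7, 3: 0.4}, answer_is_real=False))
```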

Dashboards, Tools & Essential Resources

General Resources

  • Website: macrocosmos.ai/sn1
  • Apex Chat App: app.macrocosmos.ai/apex
  • SN1 Dashboard: macrocosmos.ai/sn1/dashboard
    • The Bittensor Subnet 1 Apex Dashboard gives builders, miners and delegators an instant, data-rich window into the network's beating heart: 30-day productivity charts and a real-time leaderboard expose each miner's incentive, emission and trust scores, letting anyone spot top performers and route TAO delegations with confidence.
    • The same asymmetric evaluation pipeline that powers those metrics continuously suppresses hallucinations and, in the process, produces millions of high-quality tokens per day that flow straight into Subnet 37's fine-tuning stream, so the dashboard doubles as a live tap of fresh open-source training data. Because it links directly to Discord, GitHub, and extensive developer docs, newcomers can jump from insight to implementation in a click, fuelling the virtuous cycle of research that has already made SN1 the flagship arena for agentic workflow innovation across the entire Bittensor protocol.
  • GitHub: github.com/macrocosm-os/apex
  • Technical Documentation: docs.macrocosmos.ai/subnets/subnet-1-apex
  • MCP Guide: docs.macrocosmos.ai/developers/tools/macrocosmos-mcp
  • Mission Commander (Gravity) Docs: docs.macrocosmos.ai/constellation-user-guides/gravity

R&D and Community


License: MIT
