Labels Predicted by AI
Federated Learning System, Blockchain Technology
Abstract
The rising demand for collaborative machine learning and data analytics calls for secure, decentralized data-sharing frameworks that balance privacy, trust, and incentives. Existing approaches, including federated learning (FL) and blockchain-based data markets, fall short: FL often depends on trusted aggregators and lacks Byzantine robustness, while blockchain frameworks struggle with computation-intensive training and incentive integration. We present a decentralized data marketplace that unifies federated learning, blockchain arbitration, and economic incentives into a single framework for privacy-preserving data sharing. The marketplace enables data buyers to submit bid-based requests via blockchain smart contracts, which manage auctions, escrow, and dispute resolution. Computationally intensive training is delegated to a Compute Network for Execution, an off-chain distributed execution layer. To safeguard against adversarial behavior, the system integrates a modified YODA protocol with exponentially growing execution sets for resilient consensus and introduces Corrected OSMD to mitigate malicious or low-quality contributions from sellers. All protocols are incentive-compatible, and our game-theoretic analysis establishes honesty as the dominant strategy. We implement the system on Ethereum and evaluate it on benchmark datasets (MNIST, Fashion-MNIST, and CIFAR-10) under varying adversarial settings. The system achieves up to 99% accuracy on MNIST and 90% on Fashion-MNIST, with less than 3% accuracy degradation with up to 30% Byzantine nodes, and 56% accuracy on CIFAR-10 despite that dataset's complexity. Our results show that the system preserves privacy, maintains robustness under adversarial conditions, and scales efficiently with the number of participants, making it a practical foundation for real-world decentralized data sharing.
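The abstract names Corrected OSMD as the defense against malicious or low-quality seller updates but does not describe it, and that protocol is not reproduced here. The sketch below only illustrates the general idea of Byzantine-robust aggregation in federated learning using a coordinate-wise trimmed mean; the function name aggregate_updates, the trim_ratio parameter, and the trimmed-mean rule itself are illustrative assumptions, not the paper's method.

```python
import numpy as np

def aggregate_updates(updates, trim_ratio=0.1):
    """Aggregate seller model updates with a coordinate-wise trimmed mean.

    updates: list of 1-D numpy arrays, one flattened model update per seller.
    trim_ratio: fraction of extreme values discarded at each end, per coordinate.
    """
    stacked = np.stack(updates)              # shape: (num_sellers, num_params)
    k = int(trim_ratio * stacked.shape[0])   # sellers trimmed per side
    sorted_vals = np.sort(stacked, axis=0)   # sort each coordinate across sellers
    if k > 0:
        sorted_vals = sorted_vals[k:-k]      # drop the k smallest and k largest values
    return sorted_vals.mean(axis=0)          # robust aggregate update

# Toy run: 10 honest sellers plus 3 Byzantine sellers sending large constant outliers.
rng = np.random.default_rng(0)
honest = [rng.normal(0.0, 0.01, size=100) for _ in range(10)]
byzantine = [np.full(100, 50.0) for _ in range(3)]
agg = aggregate_updates(honest + byzantine, trim_ratio=0.25)
print(float(np.abs(agg).max()))  # stays close to 0 despite the outliers
```

In this toy run the three outlier updates land at the top of every coordinate-wise ordering and are trimmed away, so the aggregate remains close to the honest sellers' updates, which is the behavior the abstract's robustness claims (less than 3% accuracy degradation with up to 30% Byzantine nodes) depend on.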
