Federated learning is a model training technique that enables devices to learn collaboratively from a shared model. So why is this important? It addresses some of the privacy concerns of centralized machine learning going forward.
A great source for AI news is actually Synced. At the Last Futurist we highly recommend it.
In 2017 Google introduced Federated Learning (FL), “a specific category of distributed machine learning approaches which trains machine learning models using decentralized data residing on end devices such as mobile phones.” Remember that 2017 was a crypto bull market where “decentralized” was in vogue, so a more decentralized AI really excited a lot of people.
So federated learning is a bit like machine learning training on decentralized data. The shared model is first trained on a server using proxy data. Each device then downloads the model and improves it using data from the device itself, i.e. federated data. This is really important for the future of BigData.
We see this in technology when they speak about device centric privacy. The device trains the model with the locally available data. The changes made to the model are summarized as an update that is then sent to the cloud. The training data and individual updates remain on the device.
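As a rough illustration of that on-device step, here is a minimal sketch. It assumes a simple linear model trained with gradient descent; the function and variable names are hypothetical, not any production API. The point is that only the summarized update, never the raw data, leaves the device:

```python
import numpy as np

def local_update(global_weights, local_X, local_y, lr=0.1, epochs=5):
    """Train a linear model on-device and return only the weight delta.

    The raw data (local_X, local_y) never leaves this function;
    only the summarized update is returned for upload to the cloud.
    """
    w = global_weights.copy()
    for _ in range(epochs):
        preds = local_X @ w
        grad = local_X.T @ (preds - local_y) / len(local_y)
        w -= lr * grad
    return w - global_weights  # the update sent to the server

# A device holding private data computes its update locally:
global_w = np.zeros(2)
X = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
y = np.array([1.0, 2.0, 3.0])
delta = local_update(global_w, X, y)
```

The server only ever sees `delta`, a summary of how the model should change, which is the core of the privacy argument.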
New advances in privacy are important for machine learning’s evolution and the public’s acceptance of becoming more reliant on AI. Becoming reliant on AI is inevitable for any technological species.
So how does federated learning work? To ensure faster uploads of these updates, the model is compressed using random rotations and quantization. When the devices send their specific models to the server, the models are averaged to obtain a single combined model. This is done for several iterations until a high-quality model is obtained.
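A minimal sketch of that server-side step, in pure NumPy with hypothetical names. Real deployments apply random rotations before quantizing, which this toy version omits, and weight each device’s update by its local dataset size:

```python
import numpy as np

def quantize(update, levels=256):
    """Crude uniform quantization to shrink an update before upload.
    (Production systems combine random rotations with quantization;
    this illustrative version skips the rotation step.)"""
    lo, hi = update.min(), update.max()
    if hi == lo:
        return update
    scale = (hi - lo) / (levels - 1)
    return np.round((update - lo) / scale) * scale + lo

def federated_average(client_updates, client_sizes):
    """Combine device updates, weighting each by its local dataset size."""
    total = sum(client_sizes)
    return sum(u * (n / total) for u, n in zip(client_updates, client_sizes))

# Three hypothetical devices report quantized updates:
updates = [quantize(np.array([0.2, -0.1])),
           quantize(np.array([0.4,  0.3])),
           quantize(np.array([0.0,  0.1]))]
avg = federated_average(updates, client_sizes=[100, 50, 50])
```

Repeating this download-train-upload-average loop for several rounds is what gradually produces the high-quality combined model.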
Circa 2022 this is becoming more standard. Federated (decentralized) learning is an approach in which the device downloads the current model and computes an updated model locally using its own data, rather than shipping that data to a central pool. It is in the same spirit as Apple’s new emphasis on consumer privacy, giving users more choice over cookie tracking, versus Facebook following you around constantly or forcing your WhatsApp data to be shared with Facebook.
So with the decentralization of AI, there are efficacy gains as well, not just privacy advantages.
The Decentralization of AI and Node Specific Machine Learning
Compared to centralized machine learning, federated learning has a couple of specific advantages:
- Ensuring privacy, since the data remains on the user’s device.
- Lower latency, because the updated model can be used to make predictions on the user’s device.
- Smarter models, given the collaborative training process.
- Less power consumption, as models are trained on a user’s device.
This is important when dealing with the larger data sets of BigData. Federated learning makes it possible for AI algorithms to gain experience from a vast range of data located at different sites. The decentralization of AI is exciting as a real-world parallel evolution with decentralized technology.
TensorFlow enables the application of federated learning through a dedicated framework.
TensorFlow Federated (TFF) is an open-source framework for machine learning and other computations on decentralized data.
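To make the idea of “computations on decentralized data” concrete without depending on the TFF library itself, here is a dependency-free toy in plain Python: a federated mean, where each client uploads only a local sum and count, never the raw values. The client names and data are made up for illustration:

```python
# Each client reports only a local summary (sum, count); the server
# derives the global mean without ever seeing the raw values.
client_data = {
    "phone_a": [4.0, 6.0],
    "phone_b": [10.0],
    "phone_c": [2.0, 8.0, 6.0],
}

def client_summary(values):
    # Runs on-device; only these two numbers leave the device.
    return sum(values), len(values)

summaries = [client_summary(v) for v in client_data.values()]
total, count = map(sum, zip(*summaries))
federated_mean = total / count
```

TFF generalizes this pattern: you declare what each client computes and how the server aggregates it, and the framework handles the distributed execution.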
Federated learning is necessary for certain breakthroughs in AI for healthcare, where sensitive patient data is involved. A more decentralized AI also enables superior collaboration. Think about it: the approach enables several organizations to collaborate on the development of models without needing to directly share sensitive clinical data with each other.
Compared to the centralized training approach, federated learning is a decentralized training approach that enables mobile phones at different geographical locations to collaboratively learn a machine learning model while keeping all the personal data, which may contain private information, on the device. So federated learning is an AI architecture breakthrough suited to the mobile era.
Over the course of several training iterations the shared models get exposed to a significantly wider range of data than what any single organization possesses in-house. This speeds up digital transformation and how BigData can be utilized to improve systems.
Federated learning is therefore not just about privacy and efficacy but about improving how collaboration and innovation take place based on data-sharing. This opens up new business models and new key areas where AI can ultimately improve society.
FL allows machine learning algorithms to gain experience from a broad range of data sets located at different sites. This means that changing how machine learning operates can itself lead to other breakthroughs.
The Federated Learning (FL) API allows developers to apply federated training and evaluation to existing TensorFlow models. For example, a recurrent neural network language model can be trained using decentralized on-device datasets, improving prediction and the efficacy of predictive analytics.
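The TFF Learning API itself is beyond a quick snippet, but a toy stand-in can show the flavor. Below, each hypothetical phone computes word-bigram counts from its own typing history locally, and only the counts are merged into a shared next-word predictor. This is an illustrative analogue, not the actual RNN training that TFF performs:

```python
from collections import Counter, defaultdict

# Hypothetical on-device typing histories; the raw text never leaves
# the device, only the bigram counts do.
device_histories = [
    "good morning good night",
    "good morning everyone good morning",
    "good night moon",
]

def local_bigram_counts(text):
    # Runs on-device: count which word follows which.
    words = text.split()
    counts = defaultdict(Counter)
    for prev, nxt in zip(words, words[1:]):
        counts[prev][nxt] += 1
    return counts

def merge(all_counts):
    # Runs on the server: combine per-device counts into one model.
    merged = defaultdict(Counter)
    for counts in all_counts:
        for prev, nxt_counts in counts.items():
            merged[prev].update(nxt_counts)
    return merged

model = merge(local_bigram_counts(t) for t in device_histories)
prediction = model["good"].most_common(1)[0][0]  # most likely next word
```

The shared predictor gets better because it has seen patterns from every device, even though no device’s text was ever uploaded, which is exactly the trade the real FL API makes at much larger scale.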
To be considered an expert in a particular medical field, you generally need to have clocked 15 years on the job. Such an expert has probably read around 15,000 cases a year, which adds up to around 225,000 over their career. With federated learning, machine learning models can be trained on far more data than even the most experienced experts have seen. Think about what that means for how AI will become our “custodians,” if you will, not just in healthcare but in other aspects of our lives and society.