Raven Distribution Framework (RDF) is our suite of libraries to train machine learning/deep learning models in a decentralized and distributed manner. It can also be used to perform statistical operations. Most importantly, it facilitates faster and cheaper training of ML/DL models on browser nodes. With RDF, we aim to build an ecosystem and accelerate the development of the fundamentals.
Let’s explore the libraries:
RavCom is a common library that contains various methods to interact with databases like MySQL, Redis, and PostgreSQL. It is shared across most of our other libraries.
Op is the fundamental…
Generative Adversarial Networks, or GANs for short, have been successful in generating high-quality images, videos, and audio, and we have seen various use cases for them. Some of the popular GAN networks are BigGAN, StyleGAN, GameGAN, and PGGAN. They have seen widespread adoption in industry and academia. But evaluation of generated samples can be very tricky and prone to errors if done subjectively by humans. The Inception Score (IS) was introduced to overcome this problem.
The Inception Score (IS) is an objective performance metric used to evaluate the quality of synthetic images generated by Generative Adversarial Networks (GANs). It measures how realistic and diverse…
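The metric above can be sketched as a short computation: given an (N, C) array of class probabilities p(y|x), typically obtained from a pretrained Inception-v3 classifier, the score is the exponentiated average KL divergence between each conditional distribution and the marginal p(y). This is a minimal sketch under that assumption; the function name and toy inputs are mine, not from the article:

```python
import numpy as np

def inception_score(probs, eps=1e-12):
    """IS = exp( mean_x KL( p(y|x) || p(y) ) ).

    `probs` is an (N, C) array of class probabilities p(y|x); in a real
    pipeline these come from a pretrained Inception-v3 network run over
    the generated images. This sketch only computes the score itself.
    """
    marginal = probs.mean(axis=0)  # p(y): marginal label distribution
    # Per-sample KL divergence between p(y|x) and p(y)
    kl = probs * (np.log(probs + eps) - np.log(marginal + eps))
    return float(np.exp(kl.sum(axis=1).mean()))
```

With three perfectly confident, perfectly diverse predictions (`np.eye(3)`) the score reaches its maximum of 3; if every sample collapses to the same class, the score drops to 1, reflecting the "realistic and diverse" intuition.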
Generative Adversarial Networks (GANs) are very difficult to evaluate compared to other networks. And evaluating the quality of a GAN is very important, because it can help us choose the right model, decide when to stop training, and figure out how to improve the model. Among several methods, the Fréchet Inception Distance (FID) is one performance metric for evaluating the quality of GANs.
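FID models the Inception features of real and generated images as two Gaussians and measures the distance between them: FID = ||mu_r - mu_g||^2 + Tr(C_r + C_g - 2(C_r C_g)^(1/2)). A minimal sketch of that formula, assuming the feature means and covariances have already been extracted (the function name is mine, not the article's):

```python
import numpy as np
from scipy.linalg import sqrtm

def frechet_distance(mu1, cov1, mu2, cov2):
    """Fréchet distance between two Gaussians (mu1, cov1) and (mu2, cov2).

    FID = ||mu1 - mu2||^2 + Tr(C1 + C2 - 2 * sqrt(C1 @ C2)).
    In a real pipeline the means/covariances come from Inception-v3
    activations over real and generated image sets.
    """
    covmean = sqrtm(cov1 @ cov2)
    # sqrtm can return tiny imaginary parts from numerical error
    if np.iscomplexobj(covmean):
        covmean = covmean.real
    diff = mu1 - mu2
    return float(diff @ diff + np.trace(cov1 + cov2 - 2.0 * covmean))
```

Identical distributions give a distance of 0; lower FID means the generated distribution is statistically closer to the real one.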
Five years ago, Generative Adversarial Networks (GANs) started a revolution in deep learning, and that revolution has produced some major technological breakthroughs. GANs were introduced by Ian Goodfellow and others in the paper titled “Generative Adversarial Networks” (https://arxiv.org/abs/1406.2661). Academia embraced GANs with open arms, and industry welcomed them with much fanfare too. The rise of GANs was inevitable.
First, the best thing about GANs is that their learning is unsupervised. GANs don’t need labeled data, which makes them powerful, as the tedious work of data labeling is not required.
Second, the potential use-cases of GANs have put…
We all agree on one thing: backpropagation is a revolutionary learning algorithm. It has helped us train almost all neural network architectures, and with the help of GPUs it has reduced months of training time to hours or days. It has made the efficient training of neural networks possible.
I can think of two reasons for its widespread adoption: (1) we didn’t have anything better than backpropagation, and (2) it worked. Backpropagation is based on the chain rule of differentiation.
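To make the chain rule concrete, here is a small sketch of one forward and backward pass through a single sigmoid neuron with squared loss; the setup and names are illustrative, not from the article:

```python
import math

def forward_backward(x, w, b, y):
    """One neuron: z = w*x + b, a = sigmoid(z), L = 0.5*(a - y)^2.

    The backward pass applies the chain rule:
      dL/dw = dL/da * da/dz * dz/dw
    """
    z = w * x + b                      # linear step
    a = 1.0 / (1.0 + math.exp(-z))     # sigmoid activation
    loss = 0.5 * (a - y) ** 2          # squared loss

    dL_da = a - y                      # derivative of loss w.r.t. activation
    da_dz = a * (1.0 - a)              # derivative of sigmoid
    dL_dw = dL_da * da_dz * x          # chain rule: dz/dw = x
    dL_db = dL_da * da_dz              # chain rule: dz/db = 1
    return loss, dL_dw, dL_db
```

The analytic gradient can be sanity-checked against a centered finite difference, which is the usual way to verify a hand-written backward pass.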
The problem lies in the implementation of the Backpropagation algorithm itself. To…
Regularization is an important concept in machine learning: it addresses the overfitting problem, and understanding it is essential to training a good model. Sometimes one resource is not enough to build a good understanding of a concept; I learnt regularization from several sources, and I feel learning from different sources is very important. An easy and simple explanation is what everyone needs. I am listing 2 Quora answers and 5 articles, and I hope they will help.
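As one concrete instance of how regularization combats overfitting, L2 (ridge) regularization adds a penalty lam*||w||^2 to the loss, which shrinks the learned weights. A minimal sketch using the closed-form ridge solution (the function name and data are illustrative, not tied to any of the listed resources):

```python
import numpy as np

def ridge_fit(X, y, lam):
    """Closed-form ridge regression: w = (X^T X + lam*I)^-1 X^T y.

    The lam * ||w||^2 penalty biases the solution toward smaller
    weights; larger `lam` means stronger regularization.
    """
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)
```

Fitting the same data with a larger `lam` yields a weight vector with a smaller norm, which is exactly the shrinkage effect that keeps the model from fitting noise.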
Neural style transfer and deep photo style transfer are interesting fields of deep learning, and their popularity has grown to another level. Apps like Prisma and Deepart.io accelerated that popularity. If you are working with neural style transfer or deep photo style transfer, here are some very important resources (papers, implementations, and tutorials) to help you out.
Machine learning is complex. For newbies, starting to learn machine learning can be painful if they don’t have the right resources to learn from. Most machine learning libraries are difficult to understand, and the learning curve can be a bit frustrating. I am creating a repository on GitHub (cheatsheets-ai) containing cheatsheets for different machine learning frameworks, gathered from different sources. Do visit the GitHub repository, and contribute cheat sheets if you have any. Thanks.
List of Cheatsheets:
7. Neural Networks Zoo
10. R Studio
11. Jupyter Notebook
Co-founder - Mate Labs | Co-founder - Raven Protocol | Author - Generative Adversarial Networks Projects | Democratizing Artificial Intelligence