Hello! What an interesting and useful project! I have some questions:
1) I am not an expert in machine learning. Can you explain some of the terms in a non-specialist way? Specifically, what are epochs, learning rates, and layers?
2) How large were the data sets that you used for training, and then for testing?
3) I know that there are a variety of machine learning models, most of which have been given names. However, it seems that you have created an original model of your own; is that right? (If not, can you say more about which of the known models you used?)
Hello! Thank you for showing interest in our project. To take your questions in order:

1) Machine learning is an application of artificial intelligence that gives systems the ability to learn and improve automatically through repeated experience, without being explicitly programmed for each task. Within that framework: an epoch is one complete pass of the entire training dataset through the learning algorithm; the learning rate controls how strongly the model's internal parameters are adjusted at each step, in other words how quickly the model adapts to the problem; and a layer receives weighted inputs, transforms them with a set of (mostly non-linear) functions, and passes the resulting values as output to the next layer.
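To make those three terms concrete, here is a toy gradient-descent sketch in Python. It is purely illustrative (not our actual model): it shows where the number of epochs and the learning rate enter a training loop, and a neural-network layer simply stacks many such weighted transformations behind non-linear functions.

```python
# Toy example: fit a straight line y = w*x + b to four data points
# with gradient descent, to show where "epochs" and "learning rate" appear.
import numpy as np

x = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([1.0, 3.0, 5.0, 7.0])   # true relationship: y = 2x + 1

w, b = 0.0, 0.0
learning_rate = 0.1    # how big a step we take after each update
n_epochs = 200         # how many full passes over the training data

for epoch in range(n_epochs):
    y_pred = w * x + b                      # forward pass
    grad_w = 2 * np.mean((y_pred - y) * x)  # gradients of the mean squared error
    grad_b = 2 * np.mean(y_pred - y)
    w -= learning_rate * grad_w             # each step is scaled by the learning rate
    b -= learning_rate * grad_b

print(w, b)  # approaches 2 and 1
```

A larger learning rate makes each update bigger (faster but less stable), and more epochs means more passes over the same data.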
2) The size of our data varies with the training parameters of our model, meaning we have to balance the number of epochs, the learning rate, and the number of hidden layers. Our dataset could not be as large as would be ideal because of limited resources, but we did make use of existing databases of known proteins and molecules.

3) Yes, we created a new model of our own. For our project testing the effectiveness of machine learning on SARMs, we took the molecular encoders from the DeepChem library and built our own model around them, teaching the computer to interpret these chemical structures. We follow a well-defined procedure for this: model initialization, training, prediction, and screening.
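For anyone curious what that procedure looks like in code, here is a simplified sketch of the four stages using DeepChem. The featurizer choice, toy molecules, layer sizes, and all other settings are illustrative placeholders rather than our exact configuration:

```python
# Simplified sketch of: initialization -> training -> prediction -> screening.
# All molecules, labels, and hyperparameters below are placeholders.
import numpy as np
import deepchem as dc

# Featurization: encode molecules (SMILES strings) as numeric vectors.
featurizer = dc.feat.CircularFingerprint(size=1024)
train_smiles = ["CCO", "c1ccccc1", "CC(=O)O"]   # toy molecules
train_labels = np.array([[1], [0], [1]])         # 1 = active, 0 = inactive
X = featurizer.featurize(train_smiles)
train_dataset = dc.data.NumpyDataset(X=X, y=train_labels)

# Model initialization: a small feed-forward classifier.
model = dc.models.MultitaskClassifier(
    n_tasks=1, n_features=1024,
    layer_sizes=[512, 128],      # two hidden layers
    learning_rate=0.001)

# Training: repeated passes (epochs) over the dataset.
model.fit(train_dataset, nb_epoch=30)

# Prediction and screening: rank unseen candidates by predicted activity.
candidate_smiles = ["CCN", "CCCC"]
candidates = dc.data.NumpyDataset(X=featurizer.featurize(candidate_smiles))
probs = model.predict(candidates)[:, 0, 1]       # probability of "active"
hits = [s for s, p in zip(candidate_smiles, probs) if p > 0.5]
print(hits)
```

The screening step simply keeps the candidate molecules whose predicted probability of activity exceeds a chosen threshold.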