Hello! What an interesting and useful project! I have some questions:
1) I am not an expert in machine learning. Can you explain some of the terms in non-specialist language? Specifically, what are epochs, learning rates, and layers?
2) How large were the data sets that you used for training, and then for testing?
3) I know that there are a variety of machine learning models, most of which have been given names. However, it seems that you have created an original model of your own; is that right? (If not, can you say more about which of the known models you used?)
The size of our data varies depending on the training parameters we set for our model: the number of epochs, the learning rate, and the number of hidden layers. In plain terms, an epoch is one complete pass through the training data; the learning rate controls how big a step the model takes each time it adjusts itself; and layers are the stacked stages of processing inside a neural network (the "hidden" ones sit between the input and the output).
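As a minimal sketch (purely illustrative, not our project code), here is how two of these terms show up as knobs in a toy training loop; the data and every value are made up for the example:

```python
import numpy as np

# Toy data: the true relationship is y = 2x. All values are made up.
x = np.array([1.0, 2.0, 3.0, 4.0])
y = np.array([2.0, 4.0, 6.0, 8.0])

w = 0.0               # the model's single adjustable parameter
learning_rate = 0.01  # how big a step each update takes
epochs = 100          # one epoch = one full pass over the training data

for epoch in range(epochs):
    pred = w * x                        # the model's current predictions
    grad = np.mean(2 * (pred - y) * x)  # gradient of the mean squared error
    w -= learning_rate * grad           # step against the gradient

print(round(w, 3))  # converges toward 2.0
```

A neural network repeats this same idea with many parameters organized into layers; adding hidden layers lets the model capture more complicated patterns.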
Yes, we created a new model of our own. For our project, which tests how effective machine learning can be when applied to SARMs, we took the molecular encoders from the DeepChem library and built a new model on top of them, teaching the computer to interpret these chemical structures. We follow a defined procedure for creating the model: model initialization, training, prediction, and screening.
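Purely as an illustration of that procedure (not our actual code), a DeepChem-style pipeline could look like the sketch below; the featurizer choice, the classifier, the hyperparameters, and the molecules and labels are all assumptions for the example, not our real SARM data:

```python
import numpy as np
import deepchem as dc

# Hypothetical molecules and activity labels; the project's real data is not shown here.
train_smiles = ["CCO", "c1ccccc1", "CC(=O)O"]
train_labels = np.array([[1.0], [0.0], [1.0]])

# 1) Encoding: turn chemical structures (SMILES strings) into numeric vectors.
featurizer = dc.feat.CircularFingerprint(size=1024)
X_train = featurizer.featurize(train_smiles)
train_dataset = dc.data.NumpyDataset(X=X_train, y=train_labels)

# 2) Model initialization: a small feed-forward classifier over the fingerprints.
model = dc.models.MultitaskClassifier(
    n_tasks=1, n_features=1024, layer_sizes=[512, 256], learning_rate=1e-3
)

# 3) Training: nb_epoch full passes over the training data.
model.fit(train_dataset, nb_epoch=30)

# 4) Prediction and screening: score new candidates and rank them.
candidate_smiles = ["CCN", "CCCC"]
X_cand = featurizer.featurize(candidate_smiles)
cand_dataset = dc.data.NumpyDataset(
    X=X_cand, y=np.zeros((len(candidate_smiles), 1))
)
scores = model.predict(cand_dataset)[:, 0, 1]  # probability of the "active" class
ranked = sorted(zip(candidate_smiles, scores), key=lambda pair: -pair[1])
print(ranked)
```

The encoder choice matters most here: CircularFingerprint is only one of several DeepChem featurizers, and swapping it changes what the model "sees" in each molecule.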