
Deep learning neural networks in modeling the structure–activity relationship

Adylova F.T.

Problems of Computational and Applied Mathematics

  • No. 3(21), 2019

Pages: 5–18

Language: Russian


Abstract


Artificial neural networks became popular in molecular and chemical informatics (in drug design) about two decades ago, but over the past decade deep learning neural networks have achieved remarkable success across many areas of artificial intelligence. This technology, which grew out of research on artificial neural networks, has outperformed other machine learning algorithms in areas such as image and speech recognition and natural language processing. The first wave of deep learning applications in pharmaceutical research has appeared in recent years, and the approach has already demonstrated its promise for a variety of problems in drug discovery. This paper surveys work on predicting bioactivity from the quantitative structure–activity relationship (QSAR) and on solving such problems when only small amounts of training data are available.
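The QSAR setting the abstract describes can be illustrated with a minimal sketch: a one-hidden-layer feedforward network with ReLU units, trained by backpropagation to map a binary fingerprint-like vector to a continuous activity value. All data below are synthetic placeholders (random bit vectors and a random linear target), not real molecular descriptors, and the architecture is a toy stand-in for the multi-task and graph-convolutional models discussed in the deep-QSAR literature; only NumPy is assumed.

```python
import numpy as np

# Minimal QSAR-style sketch: regress a continuous "activity" value from a
# binary fingerprint-like vector with one ReLU hidden layer, trained by
# full-batch gradient descent on mean squared error.
# All data here are synthetic placeholders, not real chemical descriptors.
rng = np.random.default_rng(0)

n_samples, n_bits, n_hidden = 200, 64, 32
X = rng.integers(0, 2, size=(n_samples, n_bits)).astype(float)  # fake fingerprints
w_true = rng.normal(size=n_bits)
y = X @ w_true + 0.1 * rng.normal(size=n_samples)  # synthetic "activity"

# Network parameters
W1 = rng.normal(scale=0.1, size=(n_bits, n_hidden))
b1 = np.zeros(n_hidden)
W2 = rng.normal(scale=0.1, size=n_hidden)
b2 = 0.0

lr, losses = 0.01, []
for epoch in range(300):
    h = np.maximum(0.0, X @ W1 + b1)          # ReLU hidden activations
    pred = h @ W2 + b2
    err = pred - y
    losses.append(float(np.mean(err ** 2)))   # MSE loss
    # Backpropagation of the MSE gradient
    g_pred = 2.0 * err / n_samples
    gW2 = h.T @ g_pred
    gb2 = g_pred.sum()
    g_h = np.outer(g_pred, W2) * (h > 0)      # gradient through the ReLU
    gW1 = X.T @ g_h
    gb1 = g_h.sum(axis=0)
    W2 -= lr * gW2
    b2 -= lr * gb2
    W1 -= lr * gW1
    b1 -= lr * gb1

print(f"MSE: {losses[0]:.3f} -> {losses[-1]:.3f}")
```

In real QSAR work the bit vectors would come from molecular fingerprints (e.g., extended-connectivity fingerprints) and the small-data regime would typically call for the regularization and transfer techniques (dropout, multi-task learning, one-shot learning) that the surveyed papers cover; this sketch only shows the basic ingredients.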


