Computers and computing

Malayalam Question Answering System Using Deep Learning Approaches


References

  • E. M. Voorhees, and D. M. Tice, “Building a question answering test collection,” in: Proceedings of the 23rd Annual International ACM SIGIR Conference on Research and Development in Information Retrieval, 2000, pp. 200–207.
  • P. Gupta, and V. Gupta, “A survey of text question answering techniques,” International Journal of Computer Applications, Vol. 53, no. 4, pp. 1–8, 2012. DOI: 10.5120/8406-2030.
  • KPMG, and Google, “Indian languages – Defining India’s internet,” Report, 2017.
  • R. R. K., and R. R. P. C., “A memory based approach to Malayalam noun generation,” in: 2015 International Conference on Control Communication & Computing India, 2015, pp. 634–637, IEEE.
  • J. L. Vicedo, and A. Ferrández, “Importance of pronominal anaphora resolution in question answering systems,” in: Proceedings of the 38th Annual Meeting of the Association for Computational Linguistics, 2000, pp. 555–562.
  • M. A. C. Soares, and F. S. Parreiras, “A literature review on question answering techniques, paradigms and systems,” J. King Saud Univ.-Comput. Inf. Sci., Vol. 32, no. 6, pp. 635–646, 2020. DOI: 10.1016/j.jksuci.2018.08.005.
  • S. Quarteroni, and S. Manandhar, “Designing an interactive open-domain question answering system,” Nat. Lang. Eng., Vol. 15, no. 1, pp. 73–95, 2009. DOI: 10.1017/S1351324908004919.
  • E. Damiano, R. Spinelli, M. Esposito, and G. De Pietro, “Towards a framework for closed-domain question answering in Italian,” in: 2016 12th International Conference on Signal-Image Technology & Internet-Based Systems (SITIS), 2016, pp. 604–611, IEEE.
  • R. Baradaran, R. Ghiasi, and H. Amirkhani, “A survey on machine reading comprehension systems,” arXiv preprint arXiv:2001.01582, 2020.
  • L. R. Medsker, and L. Jain (eds.), Recurrent Neural Networks: Design and Applications, Vol. 5, pp. 64–67. CRC Press, 2001.
  • S. Albawi, T. A. Mohammed, and S. Al-Zawi. “Understanding of a convolutional neural network,” in: 2017 International Conference on Engineering and Technology (ICET), 2017, pp. 1–6, IEEE.
  • S. Hochreiter, and J. Schmidhuber, “Long short-term memory,” Neural Comput., Vol. 9, no. 8, pp. 1735–1780, 1997. DOI: 10.1162/neco.1997.9.8.1735.
  • G. Rohit, E. G. Dharamshi, and N. Subramanyam, “Approaches to question answering using LSTM and memory networks,” in: Soft Computing for Problem Solving, 2019, pp. 199–209, Springer.
  • R. Dey, and F. M. Salem, “Gate-variants of gated recurrent unit (GRU) neural networks,” in: 2017 IEEE 60th International Midwest Symposium on Circuits and Systems (MWSCAS), 2017, pp. 1597–1600, IEEE.
  • S. Viswanathan, M. A. Kumar, and K. Soman, “A sequence-based machine comprehension modeling using LSTM and GRU,” in: Emerging Research in Electronics, Computer Science and Technology, 2019, pp. 47–55, Springer.
  • J. Weston, S. Chopra, and A. Bordes. “Memory networks,” arXiv preprint arXiv:1410.3916, 2014.
  • Y. Sharma, and S. Gupta, “Deep learning approaches for question answering system,” Procedia Comput. Sci., Vol. 132, pp. 785–794, 2018. DOI: 10.1016/j.procs.2018.05.090.
  • I. T. Seena, G. M. Sini, and R. Binu, “Malayalam question answering system,” Procedia Technol., Vol. 24, pp. 1388–1392, 2016. DOI: 10.1016/j.protcy.2016.05.155.
  • C. Sreejith, K. Nibeesh, and P. C. Reghu Raj, “Chodyothari: Question answering system for Malayalam,” in: Proceedings of the Fourth National Technological Congress (NATCON), 2014.
  • S. M. Archana, N. Vahab, R. Thankappan, and C. Raseek, “A rule based question answering system in Malayalam corpus using vibhakthi and POS tag analysis,” Procedia Technol., Vol. 24, pp. 1534–1541, 2016. DOI: 10.1016/j.protcy.2016.05.124.
  • M. Bindu, and I. S. Mary, “Design and development of a named entity based question answering system for Malayalam language,” PhD dissertation, Cochin University of Science and Technology, 2012.
  • B. A. Bibin, “Malayalam questions classification in question answering systems using support vector machine,” Int. J. Comput. Sci. Eng., Vol. 7, pp. 724–729, 2019.
  • M. S. Bindu, and S. M. Idicula, “Named entity recognizer employing multiclass support vector machines for the development of question answering systems,” Int. J. Comput. Appl., Vol. 25, no. 10, pp. 40–46, 2011. DOI: 10.5120/ijca.
  • U. Sasikumar, and L. S, “A survey of natural language question answering system,” Int. J. Comput. Appl., Vol. 108, no. 15, pp. 42–46, 2014. DOI: 10.5120/18991-0444.
  • P. Athira, M. Sreeja, and P. Reghuraj, “Architecture of an ontology-based domain-specific natural language question answering system,” Int. J. Web Semant. Technol., Vol. 4, no. 4, pp. 31–39, 2013. DOI: 10.5121/ijwest.
  • Y. LeCun, Y. Bengio, and G. Hinton, “Deep learning,” Nature, Vol. 521, no. 7553, pp. 436–444, 2015. DOI: 10.1038/nature14539.
  • A. Shrestha, and A. Mahmood, “Review of deep learning algorithms and architectures,” IEEE Access, Vol. 7, pp. 53040–53065, 2019. DOI: 10.1109/Access.6287639.
  • H. Hewamalage, C. Bergmeir, and K. Bandara, “Recurrent Neural Networks for Time Series Forecasting: Current status and future directions,” Int. J. Forecast., Vol. 37, no. 1, pp. 388–427, 2021. DOI: 10.1016/j.ijforecast.2020.06.008.
  • R. Pascanu, C. Gulcehre, K. Cho, and Y. Bengio, “How to construct deep recurrent neural networks,” in: International Conference on Learning Representations, 2014.
  • X. Glorot, and Y. Bengio, “Understanding the difficulty of training deep feedforward neural networks,” in: International Conference on Artificial Intelligence and Statistics, 2010.
  • Y. Bengio, N. Boulanger-Lewandowski, and R. Pascanu. “Advances in optimizing recurrent networks,” in: 2013 IEEE International Conference on Acoustics, Speech and Signal Processing, 2013, pp. 8624–8628, IEEE.
  • C. Gao, J. Yan, S. Zhou, P. K. Varshney, and H. Liu, “Long short-term memory-based deep recurrent neural networks for target tracking,” Inf. Sci. (Ny), Vol. 502, pp. 279–296, 2019. DOI: 10.1016/j.ins.2019.06.039.
  • M. Sundermeyer, R. Schlüter, and H. Ney, “LSTM neural networks for language modeling,” in: Thirteenth Annual Conference of the International Speech Communication Association, 2012.
  • K. Cho, B. Van Merriënboer, C. Gulcehre, D. Bahdanau, F. Bougares, H. Schwenk, and Y. Bengio, “Learning phrase representations using RNN encoder-decoder for statistical machine translation,” arXiv preprint arXiv:1406.1078, 2014.
  • S. Sukhbaatar, A. Szlam, J. Weston, and R. Fergus. “End-to-end memory networks,” arXiv preprint arXiv:1503.08895, 2015.
  • J. Weston, A. Bordes, S. Chopra, A. M. Rush, B. van Merriënboer, A. Joulin, and T. Mikolov, “Towards AI-complete question answering: A set of prerequisite toy tasks,” arXiv preprint arXiv:1502.05698, 2015.
  • H. Levesque, E. Davis, and L. Morgenstern, “The Winograd schema challenge,” in: Thirteenth International Conference on the Principles of Knowledge Representation and Reasoning, 2012.
  • D. Chen, and R. Mooney. “Learning to interpret natural language navigation instructions from observations,” in: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 25, 2011.
  • D. Grigar, review of Twisty Little Passages: An Approach to Interactive Fiction, by Nick Montfort, MIT Press, Cambridge, MA, USA, 2003, p. 286. ISBN: 0-262-13436-5.
  • E. Gordon-Rodriguez, G. Loaiza-Ganem, G. Pleiss, and J. P. Cunningham. “Uses and abuses of the cross-entropy loss: Case studies in modern deep learning,” 2020.
  • D. P. Kingma, and J. Ba. “Adam: A method for stochastic optimization,” arXiv preprint arXiv:1412.6980, 2014.
