[1] Johan Bollen, Huina Mao, and Xiao-Jun Zeng. Twitter mood predicts the stock market. Journal of Computational Science, 2(1):1–8, 2011. doi: 10.1016/j.jocs.2010.12.007.
[2] Venkata Sasank Pagolu, Kamal Nayan Reddy Challa, Ganapati Panda, and Babita Majhi. Sentiment analysis of Twitter data for predicting stock market movements.
[3] Xiang Zhang, Junbo Jake Zhao, and Yann LeCun. Character-level convolutional networks for text classification. CoRR, abs/1509.01626, 2015. URL http://arxiv.org/abs/1509.01626.
[4] Aliaksei Severyn and Alessandro Moschitti. UNITN: Training deep convolutional neural network for Twitter sentiment classification. In SemEval@NAACL-HLT, pages 464–469, 2015.
[5] Baruch Lev and S. Ramu Thiagarajan. Fundamental information analysis. Journal of Accounting Research, pages 190–215, 1993.
[6] Patricia M. Dechow, Amy P. Hutton, Lisa Meulbroek, and Richard G. Sloan. Short-sellers, fundamental analysis, and stock returns. Journal of Financial Economics, 61(1):77–106, 2001. ISSN 0304-405X. doi: 10.1016/S0304-405X(01)00056-3. URL http://www.sciencedirect.com/science/article/pii/S0304405X01000563.
[7] Eugene F. Fama. Efficient capital markets: A review of theory and empirical work. The Journal of Finance, 25(2):383–417, 1970. ISSN 00221082, 15406261. URL http://www.jstor.org/stable/2325486.
[8] William Brock, Josef Lakonishok, and Blake LeBaron. Simple technical trading rules and the stochastic properties of stock returns. The Journal of Finance, 47(5):1731–1764, 1992. ISSN 1540-6261. doi: 10.1111/j.1540-6261.1992.tb04681.x. URL http://dx.doi.org/10.1111/j.1540-6261.1992.tb04681.x.
[9] Rodolfo Toríbio Farias Nazário, Jéssica Lima e Silva, Vinicius Amorim Sobreiro, and Herbert Kimura. A literature review of technical analysis on stock markets. The Quarterly Review of Economics and Finance, 2017. ISSN 1062-9769. doi: 10.1016/j.qref.2017.01.014. URL http://www.sciencedirect.com/science/article/pii/S1062976917300443.
[10] T. Kimoto, K. Asakawa, M. Yoda, and M. Takeoka. Stock market prediction system with modular neural networks. In 1990 IJCNN International Joint Conference on Neural Networks, pages 1–6 vol. 1, June 1990. doi: 10.1109/IJCNN.1990.137535.
[11] Xiao Ding, Yue Zhang, Ting Liu, and Junwen Duan. Deep learning for event-driven stock prediction. In Proceedings of the 24th International Conference on Artificial Intelligence, IJCAI'15, pages 2327–2333. AAAI Press, 2015. ISBN 978-1-57735-738-4. URL http://dl.acm.org/citation.cfm?id=2832415.2832572.
[12] Kazuhiro Kohara, Tsutomu Ishikawa, Yoshimi Fukuhara, and Yukihiro Nakamura. Stock price prediction using prior knowledge and neural networks. Intelligent Systems in Accounting, Finance & Management, 6(1):11–22, 1997. ISSN 1099-1174. doi: 10.1002/(SICI)1099-1174(199703)6:1<11::AID-ISAF115>3.0.CO;2-3. URL http://dx.doi.org/10.1002/(SICI)1099-1174(199703)6:1<11::AID-ISAF115>3.0.CO;2-3.
[13] Dau-Heng Hsu. Auto-identify the influence of events based on stock news. Master's thesis, National Tsing Hua University, 2012.
[14] Yi Zuo and Eisuke Kita. Stock price forecast using Bayesian network. Expert Systems with Applications, 39(8):6729–6737, 2012. ISSN 0957-4174. doi: 10.1016/j.eswa.2011.12.035. URL http://www.sciencedirect.com/science/article/pii/S0957417411017064.
[15] Lean Yu, Shouyang Wang, and Kin Keung Lai. Neural network-based mean–variance–skewness model for portfolio selection. Computers & Operations Research, 35(1):34–46, 2008. ISSN 0305-0548. doi: 10.1016/j.cor.2006.02.012. URL http://www.sciencedirect.com/science/article/pii/S0305054806000505. Part Special Issue: Applications of OR in Finance.
[16] Richard Socher, Danqi Chen, Christopher D. Manning, and Andrew Ng. Reasoning with neural tensor networks for knowledge base completion. In C. J. C. Burges, L. Bottou, M. Welling, Z. Ghahramani, and K. Q. Weinberger, editors, Advances in Neural Information Processing Systems 26, pages 926–934. Curran Associates, Inc., 2013. URL https://goo.gl/P8cBeb.
[17] Y. Deng, F. Bao, Y. Kong, Z. Ren, and Q. Dai. Deep direct reinforcement learning for financial signal representation and trading. IEEE Transactions on Neural Networks and Learning Systems, 28(3):653–664, March 2017. ISSN 2162-237X. doi: 10.1109/TNNLS.2016.2522401.
[18] Alexander M. Rush, Sumit Chopra, and Jason Weston. A neural attention model for abstractive sentence summarization.
[19] Haim Sak, Andrew Senior, and Françoise Beaufays. Long short-term memory based recurrent neural network architectures for large vocabulary speech recognition.
[20] Douglas Eck and Juergen Schmidhuber. A first look at music composition using LSTM recurrent neural networks. Technical report, 2002.
[21] Chun-Chi J. Chen and Risto Miikkulainen. Creating melodies with evolving recurrent neural networks. In Proceedings of the INNS-IEEE International Joint Conference on Neural Networks, pages 2241–2246, Piscataway, NJ, 2001. IEEE. URL http://nn.cs.utexas.edu/?chen:ijcnn01.
[22] Jeffrey L. Elman. Distributed representations, simple recurrent networks, and grammatical structure. Machine Learning, 7(2):195–225, September 1991. ISSN 1573-0565. doi: 10.1023/A:1022699029236. URL http://dx.doi.org/10.1023/A:1022699029236.
[23] Y. LeCun, B. Boser, J. S. Denker, D. Henderson, R. E. Howard, W. Hubbard, and L. D. Jackel. Backpropagation applied to handwritten zip code recognition. Neural Computation, 1(4):541–551, December 1989. ISSN 0899-7667. doi: 10.1162/neco.1989.1.4.541. URL http://dx.doi.org/10.1162/neco.1989.1.4.541.
[24] Alex Krizhevsky, Ilya Sutskever, and Geoffrey E. Hinton. ImageNet classification with deep convolutional neural networks. In F. Pereira, C. J. C. Burges, L. Bottou, and K. Q. Weinberger, editors, Advances in Neural Information Processing Systems 25, pages 1097–1105. Curran Associates, Inc., 2012. URL https://goo.gl/6UhXmK.
[25] Karen Simonyan and Andrew Zisserman. Very deep convolutional networks for large-scale image recognition. CoRR, abs/1409.1556, 2014. URL http://arxiv.org/abs/1409.1556.
[26] Christian Szegedy, Sergey Ioffe, and Vincent Vanhoucke. Inception-v4, Inception-ResNet and the impact of residual connections on learning. CoRR, abs/1602.07261, 2016. URL http://arxiv.org/abs/1602.07261.
[27] Jeff Donahue, Lisa Anne Hendricks, Sergio Guadarrama, Marcus Rohrbach, Subhashini Venugopalan, Kate Saenko, and Trevor Darrell. Long-term recurrent convolutional networks for visual recognition and description. CoRR, abs/1411.4389, 2014. URL http://arxiv.org/abs/1411.4389.
[28] Hyeonwoo Noh, Paul Hongsuck Seo, and Bohyung Han. Image question answering using convolutional neural network with dynamic parameter prediction. CoRR, abs/1511.05756, 2015. URL http://arxiv.org/abs/1511.05756.
[29] Ronan Collobert, Jason Weston, Léon Bottou, Michael Karlen, Koray Kavukcuoglu, and Pavel P. Kuksa. Natural language processing (almost) from scratch. CoRR, abs/1103.0398, 2011. URL http://arxiv.org/abs/1103.0398.
[30] Tomas Mikolov, Kai Chen, Greg Corrado, and Jeffrey Dean. Efficient estimation of word representations in vector space. CoRR, abs/1301.3781, 2013. URL http://arxiv.org/abs/1301.3781.
[31] Tobias Schnabel, Igor Labutov, David M. Mimno, and Thorsten Joachims. Evaluation methods for unsupervised word embeddings. In EMNLP, pages 298–307, 2015.
[32] D. E. Rumelhart, G. E. Hinton, and R. J. Williams. Parallel distributed processing: Explorations in the microstructure of cognition, vol. 1, chapter Learning Internal Representations by Error Propagation, pages 318–362. MIT Press, Cambridge, MA, USA, 1986. ISBN 0-262-68053-X. URL http://dl.acm.org/citation.cfm?id=104279.104293.
[33] Y. Bengio, P. Simard, and P. Frasconi. Learning long-term dependencies with gradient descent is difficult. IEEE Transactions on Neural Networks, 5(2):157–166, March 1994. ISSN 1045-9227. doi: 10.1109/72.279181. URL http://dx.doi.org/10.1109/72.279181.
[34] Sepp Hochreiter and Jürgen Schmidhuber. Long short-term memory. Neural Computation, 9(8):1735–1780, November 1997. ISSN 0899-7667. doi: 10.1162/neco.1997.9.8.1735. URL http://dx.doi.org/10.1162/neco.1997.9.8.1735.
[35] Junyoung Chung, Çağlar Gülçehre, KyungHyun Cho, and Yoshua Bengio. Empirical evaluation of gated recurrent neural networks on sequence modeling. CoRR, abs/1412.3555, 2014. URL http://arxiv.org/abs/1412.3555.