Abstract: Machine Learning (ML) is the science of giving computers the ability to learn without being explicitly programmed. ML is pervasive today, with applications in speech recognition, recommendation systems, fraud detection, and many more areas that we may not even be aware of. To facilitate a rapid pace of development, it is important to create a framework with modularity and reusability. Learning Based Java (LBJava) was introduced by the Cognitive Computation Group (CCG) to achieve this goal.
This thesis extends and introduces multiple components in LBJava. We begin with a comprehensive literature review related to Learning Based Programming (LBP) and LBJava.
We then introduce regression evaluation metrics to LBJava, add Adaptive Sub-Gradient (AdaGrad) support for regression, and provide a comprehensive tutorial with an example on regression. Furthermore, we extend both the SGD and AdaGrad algorithms for classification, and evaluate across various learning algorithms, with sparse and dense features, using large programmatically generated datasets.
Moreover, we introduce Neural Networks (NN), in particular the Multilayer Perceptron (MLP), to LBJava, along with miscellaneous supporting work.
Lastly, we summarize all of the extended and newly added components and provide recommendations for future work.