Host: The Japanese Society for Artificial Intelligence
Name: The 33rd Annual Conference of the Japanese Society for Artificial Intelligence, 2019
Number: 33
Location: [in Japanese]
Date: June 4-7, 2019
We propose a second-order approximated solution for Logistic Regression with strong L2 regularization, expressed entirely in terms of matrix operations. Since training a Logistic Regression model is a convex optimization problem, efficient iterative techniques for solving it, such as Gradient Descent, are readily available. However, to the best of our knowledge, a solution in the form of matrix operations has not been presented. In general, matrix operations are faster and more convenient than solving an optimization problem iteratively. In principle, the matrix-operation approximated solution is valid only for Logistic Regression with very strong L2 regularization; in our empirical analysis, however, it also serves as a good approximation even when the L2 regularization strength is set within a practical range. The method can also be used to generate good parameter initializations efficiently. The mathematical proof is presented in this paper.
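The abstract does not state the closed form, so the following is only a minimal sketch of one natural realization: a second-order Taylor expansion of the L2-regularized logistic loss around w = 0, where strong regularization keeps the quadratic model accurate, so the minimizer is obtained by a single linear solve. The function name `approx_logreg`, the choice of labels y in {-1, +1}, and the usage shown are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def approx_logreg(X, y, lam):
    """Closed-form second-order approximation of L2-regularized logistic regression.

    Assumes labels y in {-1, +1}. The loss
        sum_i log(1 + exp(-y_i * x_i^T w)) + (lam / 2) * ||w||^2
    is expanded to second order around w = 0, where sigmoid(0) = 1/2, giving
    gradient -X^T y / 2 and Hessian X^T X / 4 + lam * I. The quadratic model is
    then minimized by one matrix solve.
    """
    n_features = X.shape[1]
    H = 0.25 * X.T @ X + lam * np.eye(n_features)  # approximate Hessian at w = 0
    g = 0.5 * X.T @ y                              # negative gradient at w = 0
    return np.linalg.solve(H, g)

# Hypothetical usage: the result can be used directly under strong
# regularization, or as a warm start (initialization) for an iterative solver.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
w_true = np.array([1.0, -2.0, 0.5, 0.0, 3.0])
y = np.sign(X @ w_true + 0.1 * rng.normal(size=200))
w0 = approx_logreg(X, y, lam=10.0)  # strong L2 regularization
print(w0)
```

Because it requires only one Hessian assembly and one linear solve, such a sketch costs roughly the same as a single Newton iteration, which is why a result of this kind is attractive both as a standalone approximation and as an efficient initializer.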