INDUCTIVE inference is the problem of reasoning under conditions of incomplete information, or uncertainty. According to Shannon's theory ([Cover]), information and uncertainty are two sides of the same coin: the more uncertainty there is, the more information we gain by removing that uncertainty. Entropy plays a central role in many scientific realms, ranging from physics and statistics to data science and economics. A basic problem in information theory is encoding large quantities of information ([Cover]). Shannon's discovery of the fundamental laws of data compression and transmission marked the birth of information theory. In his fundamental paper of 1948, A Mathematical Theory of Communication ([Shannon1948]), Shannon proposed a measure of the uncertainty associated with a memoryless random source, called entropy.
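As a minimal sketch of Shannon's measure, the entropy of a discrete distribution with probabilities p_i is H = -Σ p_i log2(p_i), measured in bits. The function below (the name `entropy` and the example distributions are illustrative, not taken from the source) shows that a fair coin carries exactly one bit of uncertainty, while a biased or certain outcome carries less:

```python
import math

def entropy(probs):
    """Shannon entropy, in bits, of a discrete probability distribution."""
    # Terms with p = 0 contribute nothing (0 * log 0 is taken as 0).
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(entropy([0.5, 0.5]))  # fair coin: 1.0 bit, maximal uncertainty
print(entropy([0.9, 0.1]))  # biased coin: less than 1 bit
print(entropy([1.0]))       # certain outcome: 0.0 bits, no uncertainty
```

Removing the uncertainty of the fair coin thus yields one bit of information, which is precisely the sense in which uncertainty and information are two sides of the same coin.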
Multivariable methods, such as logistic regression, are routinely used in statistical analyses across a wide range of domains. In particular, logistic regression is the most frequently used method for modeling binary response data and for binary classification. When the response variable is binary, it characteristically takes the form of 1/0, with 1 normally indicating a success and 0 a failure. Multivariable methods usually assume a relationship between two or more independent (predictor) variables and one dependent (response) variable. The predicted value of the response variable may be expressed as a sum of products, wherein each product is formed by multiplying the value of a variable by its coefficient; the coefficients themselves are estimated from a respective data set. Logistic regression is heavily used in supervised machine learning and has become the workhorse for both binary and multiclass classification problems. Many of the questions introduced in this chapter and in the Information Theory chapter are crucial for truly understanding the inner workings of artificial neural networks.
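The sum-of-products form above can be sketched in a few lines. The logistic model passes the linear predictor (intercept plus the sum of coefficient-times-feature products) through the sigmoid function to obtain the probability of success; the coefficient values here are hypothetical placeholders standing in for coefficients that would, in practice, be estimated from a data set:

```python
import math

def predict_proba(x, coef, intercept):
    """Probability of the positive class (y = 1) under a logistic model."""
    # Linear predictor: the sum of products of each feature and its coefficient.
    z = intercept + sum(c * xi for c, xi in zip(coef, x))
    # The sigmoid maps z from (-inf, inf) to a probability in (0, 1).
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical coefficients, as if fitted to some data set.
coef, intercept = [1.2, -0.7], 0.3

p = predict_proba([2.0, 1.0], coef, intercept)
print(p)                     # probability of success
print(1 if p >= 0.5 else 0)  # thresholding the probability gives the 1/0 label
```

Thresholding the predicted probability at 0.5 recovers the binary 1/0 response, which is what makes this sum-of-products model a classifier.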