As of 2012, about half of all U.S. adults (117 million people) had one or more chronic diseases, and one in four adults had two or more (CDC). In 2006, 84% of all health care spending went to the 50% of the population with one or more chronic health conditions. By 2020, the number of people (adults and children) with chronic conditions is projected to reach 157 million. By 2018, 40% of primary care encounters in the U.S. will be delivered virtually, and medical data is expected to double every 73 days by 2020. Moreover, 80% of health data is invisible to current systems because it is unstructured. The measure of a patient's overall health goes beyond medical and genomic conditions: almost 75% of a patient's status is affected by a host of lifestyle factors such as access to shelter, education, and income... More...

A good understanding of *big-O analysis* is critical to making a good impression on the interviewer. Big-O analysis is a form of run-time analysis that measures the efficiency of an algorithm in terms of the time it takes for the algorithm to run as a function of the input size. It’s not a formal benchmark, just a simple way to classify algorithms by relative efficiency.
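As a sketch of how this kind of classification works (the function names and sample data below are illustrative, not taken from the text), compare two ways of searching a sorted list: a linear scan inspects up to n elements, so it is O(n), while binary search halves the input each step, so it is O(log n):

```python
import bisect

def linear_search(items, target):
    """O(n): in the worst case every element is inspected."""
    for i, value in enumerate(items):
        if value == target:
            return i
    return -1

def binary_search(items, target):
    """O(log n): the sorted input is halved on each step."""
    i = bisect.bisect_left(items, target)
    if i < len(items) and items[i] == target:
        return i
    return -1

data = list(range(0, 100, 2))  # sorted even numbers 0, 2, ..., 98
print(linear_search(data, 42))  # → 21
print(binary_search(data, 42))  # → 21
```

Both functions return the same answer; big-O analysis captures only how their running times grow as the list gets longer, not their absolute speed on any one input.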

In mathematics, computer science, and related fields, **big-O notation** (also known as **big Oh notation**, **big Omicron notation**, **Landau notation**, **Bachmann–Landau notation**, and **asymptotic notation**), along with the closely related *big-Omega*, *big-Theta*, and *little-o notations*, describes the limiting behavior of a function when the argument tends towards a particular value or infinity, usually in terms of simpler functions. Big-O notation characterizes functions according to their growth rates: different functions with the same growth rate may be represented using the same O notation. More...
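To make "same growth rate, same O notation" concrete, here is a standard worked example (a sketch added for illustration, not from the original text): a polynomial belongs to the O class of its leading term.

```latex
% Claim: f(n) = 3n^2 + 2n + 1 \in O(n^2).
% For all n \ge 1, each lower-order term is at most n^2:
f(n) = 3n^2 + 2n + 1 \;\le\; 3n^2 + 2n^2 + n^2 \;=\; 6n^2,
% so the constants c = 6 and n_0 = 1 witness the definition:
f(n) \le c \cdot n^2 \quad \text{for all } n \ge n_0,
% hence
f(n) \in O(n^2).
```

The same witness-constant argument shows that 3n² + 2n + 1, n², and 100n² all share the growth rate O(n²), which is exactly the sense in which different functions "may be represented using the same O notation."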

"Before there were computers, there were algorithms." - Thomas H. Cormen. But now that there are computers, there are even more algorithms, and algorithms lie at the heart of computing. What are algorithms? Informally, an algorithm is any well-defined computational procedure that takes some value, or set of values, as input and produces some value, or set of values, as output. An algorithm is thus a sequence of computational steps that transform the input into the output. We can also view an algorithm as a tool for solving a well-specified computational problem. The statement of the problem specifies in general terms the desired input/output relationship. The algorithm describes a specific computational procedure for achieving that input/output relationship. For example, we might need to sort a sequence of numbers into nondecreasing order. This problem arises frequently in practice and provides fertile ground for introducing many standard design techniques and analysis tools. More...
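The sorting problem described above can be stated as: input, a sequence of n numbers; output, a permutation of that sequence in nondecreasing order. A minimal sketch of one classic solution, insertion sort (Python is assumed here; the text itself names no language):

```python
def insertion_sort(a):
    """Sort list a into nondecreasing order, in place.

    Worst-case running time is O(n^2); an already-sorted
    input finishes in O(n).
    """
    for j in range(1, len(a)):
        key = a[j]
        i = j - 1
        # Shift larger elements one slot right to open a gap for key.
        while i >= 0 and a[i] > key:
            a[i + 1] = a[i]
            i -= 1
        a[i + 1] = key
    return a

print(insertion_sort([5, 2, 4, 6, 1, 3]))  # → [1, 2, 3, 4, 5, 6]
```

Note how the code realizes the input/output relationship exactly as specified: whatever sequence goes in, the same multiset of numbers comes out, rearranged into nondecreasing order.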