
Tuesday, 14 August 2018

What is Artificial Intelligence?

It has been said that Artificial Intelligence will define the next generation of software solutions. If you are even remotely involved with technology, you will almost certainly have heard the term with increasing regularity over the last few years. It is likely that you will also have heard different definitions for Artificial Intelligence offered, such as:

“The ability of a digital computer or computer-controlled robot to perform tasks commonly associated with intelligent beings.” – Encyclopedia Britannica

“Intelligence demonstrated by machines, in contrast to the natural intelligence displayed by humans.” – Wikipedia

How useful are these definitions? What exactly are “tasks commonly associated with intelligent beings”? For many people, such definitions can seem too broad or nebulous. After all, there are many tasks that we can associate with human beings! What exactly do we mean by “intelligence” in the context of machines, and how is this different from the tasks that many traditional computer systems are able to perform, some of which may already seem to have some level of intelligence in their sophistication? What exactly makes the Artificial Intelligence systems of today different from sophisticated software systems of the past?


It could be argued that any attempt to try to define “Artificial Intelligence” is somewhat futile, since we would first have to properly define “intelligence”, a word which conjures a wide variety of connotations. Nonetheless, this article attempts to offer a more accessible definition for what passes as Artificial Intelligence in the current vernacular, as well as some commentary on the nature of today’s AI systems, and why they might be more aptly referred to as “intelligent” than previous incarnations.

Firstly, it is interesting and important to note that the technical difference between what was referred to as Artificial Intelligence over 20 years ago and traditional computer systems is close to zero. Prior attempts to create intelligent systems, known at the time as expert systems, involved the complex implementation of exhaustive rules that were intended to approximate intelligent behavior. For all intents and purposes, these systems did not differ from traditional computers in any drastic way other than having many thousands more lines of code. The problem with trying to replicate human intelligence in this way was that it required far too many rules and ignored something fundamental to the way intelligent beings make decisions, which is very different from the way traditional computers process information.

Let me illustrate with a simple example. Suppose I walk into your office and I say the words “Good Weekend?” Your immediate response is likely to be something like “yes” or “fine thanks”. This may seem like very trivial behavior, but in this simple action you will have immediately demonstrated a behavior that a traditional computer system is completely incapable of. In responding to my question, you have effectively dealt with ambiguity by making a prediction about the correct way to respond. It is not certain that by saying “Good Weekend” I actually intended to ask you whether you had a good weekend. Here are just a few possible intents behind that utterance:

◈ Did you have a good weekend?
◈ Weekends are good (generally).
◈ I had a good weekend.
◈ It was a good football game at the weekend, wasn’t it?
◈ Will the coming weekend be a good weekend for you?

And more.


The most likely intended meaning may seem obvious, but suppose that when you respond with “yes”, I had responded with “No, I mean it was a good football game at the weekend, wasn’t it?”. It would have been a surprise, but without even thinking, you would absorb that information into a mental model, correlate the fact that there was an important game last weekend with the fact that I said “Good Weekend?”, and adjust the probability of the expected response accordingly, so that you can respond correctly the next time you are asked the same question. Granted, those aren’t the thoughts that will pass through your head! You happen to have a neural network (aka “your brain”) that will absorb this information automatically and learn to respond differently next time.

The key point is that even when you do respond next time, you will still be making a prediction about the correct way in which to respond. As before, you won’t be certain, but if your prediction fails again, you will gather new data, which leads to my suggested definition of Artificial Intelligence, as it stands today:

“Artificial Intelligence is the ability of a computer system to deal with ambiguity, by making predictions using previously gathered data, and learning from errors in those predictions in order to generate newer, more accurate predictions about how to behave in the future”.

This is a somewhat appropriate definition of Artificial Intelligence because it is exactly what AI systems today are doing, and more importantly, it reflects an important characteristic of human beings which separates us from traditional computer systems: human beings are prediction machines. We deal with ambiguity all day long, from very trivial scenarios such as the above, to more convoluted scenarios that involve playing the odds on a larger scale. This is in one sense the essence of reasoning. We very rarely know whether the way we respond to different scenarios is absolutely correct, but we make reasonable predictions based on past experience.

Just for fun, let’s illustrate the earlier example with some code in R. First, let’s start with some data that represents information in your mind about when a particular person has said “good weekend?” to you.


In this example, we are saying that GoodWeekendResponse is our score label (i.e. it denotes the appropriate response that we want to predict). For modelling purposes, there have to be at least two possible values, in this case “yes” and “no”. For brevity, the response in most cases is “yes”.
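The original post showed this data as an image, which is not recoverable here. As a stand-in, a hypothetical greetings.csv, using the column names that appear in the code below, might look something like this (the rows are illustrative only, not the original values):

```
FootballGamePlayed,WorldCup,EnglandPlaying,GoodWeekendResponse
No,No,No,Yes
Yes,No,No,Yes
No,No,No,Yes
Yes,Yes,No,No
No,No,No,Yes
Yes,No,No,Yes
```

Note that, as described above, the response is mostly “yes”, with at least one “no” so that the model has two classes to discriminate between.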

We can fit the data to a logistic regression model:

library(VGAM)  # provides vglm() for fitting multinomial (logistic) regression models
greetings <- read.csv('c:/AI/greetings.csv', header=TRUE)
fit <- vglm(GoodWeekendResponse ~ ., family=multinomial, data=greetings)

Now what happens if we try to make a prediction with that model, where the expected response is different from anything we have previously recorded? In this case, I am expecting the response to be “Go England!!”. Below is some more code to add the prediction; for illustration we just hard-code the new input data, with the console output following:

# A new observation whose correct response differs from anything recorded so far
response <- data.frame(FootballGamePlayed="Yes", WorldCup="Yes", EnglandPlaying="Yes", GoodWeekendResponse="Go England!!")
greetings <- rbind(greetings, response)  # absorb the new observation into the data
fit <- vglm(GoodWeekendResponse ~ ., family=multinomial, data=greetings)  # re-fit the model
prediction <- predict(fit, response, type="response")  # probabilities for each possible response
prediction
index <- which.max(prediction)  # pick the most probable response
df <- colnames(prediction)
df[index]

            No Yes Go England!!
1 3.901506e-09 0.5          0.5
> index <- which.max(prediction)
> df <- colnames(prediction)
> df[index]
[1] "Yes"

The initial prediction “yes” was wrong, but note that in addition to predicting against the new data, we also incorporated the actual response back into our existing model. Also note that the new response value “Go England!!” has been learnt, with a probability of 50 percent based on current data. If we run the same piece of code again, the probability that “Go England!!” is the right response based on prior data increases, so this time our model chooses to respond with “Go England!!”, because it has finally learnt that this is most likely the correct response!

            No       Yes Go England!!
1 3.478377e-09 0.3333333    0.6666667
> index <- which.max(prediction)
> df <- colnames(prediction)
> df[index]
[1] "Go England!!"

Do we have Artificial Intelligence here? Well, clearly there are different levels of intelligence, just as there are with human beings. There is, of course, a good deal of nuance that may be missing here, but nonetheless this very simple program will be able to react, with limited accuracy, to data coming in related to one very specific topic, as well as learn from its mistakes and make adjustments based on predictions, without the need to develop exhaustive rules to account for different responses that are expected for different combinations of data. This is the same principle that underpins many AI systems today, which, like human beings, are mostly sophisticated prediction machines. The more sophisticated the machine, the more it is able to make accurate predictions based on a complex array of data used to train various models, and the most sophisticated AI systems of all are able to continually learn from faulty assertions in order to improve the accuracy of their predictions, thus exhibiting something approximating human intelligence.

Machine learning


You may be wondering, based on this definition, what the difference is between machine learning and Artificial Intelligence. After all, isn’t this exactly what machine learning algorithms do, make predictions based on data using statistical models? This very much depends on the definition of machine learning, but ultimately most machine learning algorithms are trained on static data sets to produce predictive models, so machine learning algorithms only facilitate part of the dynamic in the definition of AI offered above. Additionally, machine learning algorithms, much like the contrived example above, typically focus on specific scenarios, rather than working together to create the ability to deal with ambiguity as part of an intelligent system. In many ways, machine learning is to AI what neurons are to the brain: a building block of intelligence that can perform a discrete task, but that may need to be part of a composite system of predictive models in order to really exhibit the ability to deal with ambiguity across an array of behaviors that might approximate intelligent behavior.

Practical applications


There are a number of practical advantages to building AI systems, but as discussed and illustrated above, many of these advantages pivot around “time to market”. AI systems enable the embedding of complex decision making without the need to build exhaustive rules, which traditionally can be very time-consuming to procure, engineer and maintain. Developing systems that can “learn” and “build their own rules” can significantly accelerate organizational growth.

Microsoft’s Azure cloud platform offers an array of discrete and granular services in the AI and Machine Learning domain that allow AI developers and data engineers to avoid reinventing wheels and to consume reusable APIs. These APIs allow AI developers to build systems which display the type of intelligent behavior discussed above.

Saturday, 12 May 2018

Enhancements in Application Insights Profiler and Snapshot Debugger

We are pleased to announce a series of improvements to Application Insights Profiler and Snapshot Debugger. Profiler identifies the line of code that slows down web app performance under load. Snapshot Debugger captures the runtime exception call stack and local variables to identify the issue in code. To ensure users can easily and conveniently use the tools, we delivered the following new features for Profiler and Snapshot Debugger:

Application Insights enablement with Profiler and Snapshot Debugger


With the newly enhanced Application Insights enablement experience, Profiler and Snapshot Debugger are on by default when Application Insights is turned on.

◈ Enabling Snapshot Debugger without redeploying your web app: For ASP.NET Core web apps, Snapshot Debugger is a simple, default option when enabling App Insights. It used to require modifying the project to install a NuGet package and add exception-tracking code. Now it’s done via an ASP.NET Core hosting light-up through an App Setting, so no redeploy is required. ASP.NET support will be available very soon.

◈ Enabling Profiler with Application Insights in one step: Enabling Profiler used to be done in a separate Profiler Configuration pane, which required extra steps. This is no longer needed.

Profiler

◈ On-demand profiler: Trigger a profiler session on your web app anytime, as needed. Previously, Profiler would run randomly 5% of the time, which could miss capturing critical traces. The new on-demand profiler feature solves this problem, as users can capture traces anytime they need to.

◈ Profiler for ASP.NET Core on Linux: Profiler now works on App Services Linux ASP.NET Core 2.0 Docker images. More platforms will be supported in the future.

Snapshot Debugger

◈ Snapshot health check: Smartly diagnoses why web app runtime exceptions have no associated snapshot, making it easy and quick to troubleshoot Snapshot Debugger with more insight and visibility.

Enabling Profiler and Snapshot Debugger is now easier than ever


We enhanced the App Insights enablement experience for App Services. Suppose you have deployed a web application to an App Services resource. Later, you notice your web app is running slowly or throwing exceptions. You would want to enable App Insights on your web app to monitor and diagnose what’s going on. Of course, you don’t want to redeploy the web app just to enable a monitoring service.

With the new enablement experience, you can easily find the entry point to enable App Insights under Settings | Application Insights. The added Code level diagnostics section is on by default, enabling Profiler and Snapshot Debugger for diagnosing slow app performance and runtime exceptions.

Profiler can be enabled this easily because the Profiler agent is installed by the new App Insights site extension and enabled through an App Setting. Snapshot Debugger is enabled through an ASP.NET Core hosting light-up: the runtime will include an assembly if an environment variable is set.

The UI for the new enablement experience allows everything to be configured in one step.


App Settings are added to App Services to enable Profiler and Snapshot Debugger.


Capture Interesting Profiler Traces On-Demand


We are excited to introduce the new on-demand profiler trigger feature. To make sure critical traces are not missed, you can go to the Profiler configuration pane and click the Profile Now button to start the profiler as needed. You might trigger a profiler run in the following situations:

◈ You want to get started with Profiler by capturing your first traces to verify everything is working.

◈ You want to efficiently and reliably capture traces during a load test run.

◈ You need to promptly capture traces for a performance issue happening right now.

In addition, you get more visibility into how Profiler has been running from the Profiler run history list.


Investigate performance for ASP.NET Core on Linux using Profiler


Leveraging Event Pipe technology, we can now capture traces for an ASP.NET Core web app running inside a Linux container hosted on App Services. The profiler runs in-process within ASP.NET Core to capture traces, which introduces less overhead. The current preview release is for evaluation purposes only.


Snapshot Health Check for Quickly Understanding and Solving Issues


To address one of our top pieces of customer feedback, that sometimes snapshots do not appear for exceptions, we built a new feature to help users diagnose the reasons for missing snapshots. The service performs a health check on Snapshot Debugger based on user input. When a snapshot is missing, instead of showing nothing on the End-to-End trace viewer blade, we show a link to help the user troubleshoot what’s going on. We hope this helps our customers quickly root-cause and fix issues. We always strive to enable our customers’ success.


Tuesday, 31 October 2017

How Microsoft computer scientists and researchers are working to “solve” cancer

Scientists at Microsoft’s research labs are trying to use computer science to solve one of the most complex and deadly challenges humans face: cancer.

And, for the most part, they are doing so with algorithms and computers instead of test tubes and beakers.


One team of researchers is using machine learning and natural language processing to help oncologists figure out the most effective, individualized cancer treatments for their patients.

Another is pairing machine learning with computer vision to give radiologists a more detailed understanding of how their patients’ tumors are progressing.

Yet another group of researchers has created powerful algorithms that help scientists understand how cancers develop and what treatments will work best to fight them.

And another team is working on moonshot efforts that could one day allow scientists to program cells to fight diseases, including cancer.

While the individual projects vary widely, they share the core philosophy that success depends on both biologists and computer scientists bringing their expertise to the problem.

“The collaboration between biologists and computer scientists is actually key to making this work,” said Jeannette M. Wing, Microsoft’s corporate vice president in charge of the company’s basic research labs.

To learn about these efforts to solve cancer with the help of algorithms and computers, read the full story.