
Sunday, 8 December 2019

Microsoft has validated the Lenovo ThinkSystem SE350 edge server for Azure Stack HCI

Do you need rugged, compact-sized hyperconverged infrastructure (HCI) enabled servers to run your branch office and edge workloads? Do you want to modernize your applications and IoT functions with container technology? Do you want to leverage Azure's hybrid services such as backup, disaster recovery, update management, monitoring, and security compliance?


Microsoft and Lenovo have teamed up to validate the Lenovo ThinkSystem SE350 for Microsoft's Azure Stack HCI program. The ThinkSystem SE350 was designed and built with the unique requirements of edge servers in mind. It is versatile enough to stretch the limitations of server locations, provides a variety of connectivity and security options, and can be easily managed with Lenovo XClarity Controller. The ThinkSystem SE350 solution focuses on smart connectivity, business security, and manageability for harsh environments.

Lenovo ThinkSystem SE350:


The ThinkSystem SE350 is the latest workhorse for the edge. Designed and built with the unique requirements of edge servers in mind, it is versatile enough to stretch the limitations of server locations, provides a variety of connectivity and security options, and is easily managed with Lenovo XClarity Controller. The ThinkSystem SE350 is a rugged, compact edge solution with a focus on smart connectivity, business security, and manageability for harsh environments.

The ThinkSystem SE350 is an Intel® Xeon® D processor-based server, with a 1U-height, half-width, short-depth case that can go anywhere. Mount it on a wall, stack it on a shelf, or install it in a rack. This rugged edge server operates in ambient temperatures from 0 to 55°C and delivers full performance in high-dust and high-vibration environments.

Information availability is another challenging issue for users at the edge, who require insight into their operations at all times to ensure they are making the right decisions. The ThinkSystem SE350 is designed to provide several connectivity options with wired and secure wireless Wi-Fi and LTE connection ability. This purpose-built compact server is reliable for a wide variety of edge and IoT workloads.

Microsoft Azure Stack HCI:


Azure Stack HCI solutions bring together highly virtualized compute, storage, and networking on industry-standard x86 servers and components. Combining resources in the same cluster makes it easier for you to deploy, manage, and scale. Manage with your choice of command-line automation or Windows Admin Center.

Achieve industry-leading virtual machine (VM) performance for your server applications with Hyper-V, the foundational hypervisor technology of the Microsoft cloud, and Storage Spaces Direct technology with built-in support for non-volatile memory express (NVMe), persistent memory, and remote direct memory access (RDMA) networking.

Help keep apps and data secure with shielded virtual machines, network micro-segmentation, and native encryption.


You can take advantage of cloud and on-premises working together by connecting your hyperconverged infrastructure to the public cloud. Your team can start building cloud skills with built-in integration to Azure infrastructure management services:

◉ Azure Site Recovery for high availability and disaster recovery as a service (DRaaS).

◉ Azure Monitor, a centralized hub to track what’s happening across your applications, network, and infrastructure, with advanced analytics powered by artificial intelligence.

◉ Cloud Witness, to use Azure as the lightweight tie-breaker for cluster quorum.

◉ Azure Backup for offsite data protection and to protect against ransomware.

◉ Azure Update Management for update assessment and update deployments for Windows Virtual Machines running in Azure and on-premises.

◉ Azure Network Adapter to connect resources on-premises with your VMs in Azure via a point-to-site VPN.

◉ Azure File Sync to sync your file server with the cloud.

◉ Azure Arc for Servers to manage role-based access control, governance, and compliance policy from the Azure portal.

By deploying the Microsoft + Lenovo HCI solution, you can quickly meet your branch office and edge needs with high performance and resiliency, while protecting your business assets by enabling the Azure hybrid services built into the Azure Stack HCI branch office and edge solution.

Tuesday, 14 May 2019

Azure SQL Database Edge: Enabling intelligent data at the edge

The world of data changes at a rapid pace, with more and more data projected to be stored and processed at the edge. Microsoft has enabled enterprises to adopt a common programming surface area in their data centers with Microsoft SQL Server and in the cloud with Azure SQL Database. Latency, data governance, and network connectivity continue to pull data and compute needs toward the edge. New sensors and chip innovations with analytical capabilities at a lower cost enable more edge compute scenarios and drive higher agility for business.

At Microsoft Build 2019, we announced Azure SQL Database Edge, available in preview, to help address the requirements of data and analytics at the edge using the performant, highly available and secure SQL engine. Developers will now be able to adopt a consistent programming surface area to develop on a SQL database and run the same code on-premises, in the cloud, or at the edge.

Azure SQL Database Edge offers:

◉ Small footprint allows the database engine to run on ARM and x64 devices via the use of containers on interactive devices, edge gateways, and edge servers.

◉ Develop once and deploy anywhere scenarios through a common programming surface area across Azure SQL Database, SQL Server, and Azure SQL Database Edge.

◉ Combines data streaming and time-series data with in-database machine learning to enable low-latency analytics.

◉ Industry-leading security capabilities of Azure SQL Database to protect data at rest and in motion on edge devices and edge gateways, with management from a central portal in Azure IoT.

◉ Cloud connected, and fully disconnected edge scenarios with local compute and storage.

◉ Supports existing business intelligence (BI) tools for creating powerful visualizations with Power BI and third-party BI tools.

◉ Bi-directional data movement between the edge to on-premises or the cloud.

◉ Compatible with the popular T-SQL language, so developers can implement complex analytics using R, Python, Java, and Spark, delivering instant analytics without data movement and faster, real-time insights.


◉ Provides support for processing and storing graph, JSON, and time-series data in the database, coupled with the ability to apply analytics and in-database machine learning capabilities to non-relational data types.

For example, manufacturers that employ the use of robotics or automated work processes can achieve optimal efficiencies by using Azure SQL Database Edge for analytics and machine learning at the edge. These real-world environments can leverage in-database machine learning for immediate scoring, initiating corrective actions, and detecting anomalies.
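Because SQL Database Edge exposes the same T-SQL surface and tabular data stream (TDS) endpoint as SQL Server and Azure SQL Database, existing data-access code carries over largely unchanged. The following is a minimal, illustrative C# sketch of writing and reading sensor readings against a local SQL Database Edge instance; the connection string, table, and column names are assumptions made for the example.

using System;
using System.Data.SqlClient;

class SensorReadings
{
    // Connection string and schema are illustrative; SQL Database Edge exposes the
    // same TDS endpoint as SQL Server, so standard ADO.NET code applies unchanged.
    const string ConnectionString =
        "Server=localhost,1433;Database=EdgeDb;User Id=sa;Password=<your-password>;";

    static void Main()
    {
        using (var connection = new SqlConnection(ConnectionString))
        {
            connection.Open();

            // Insert a reading into a hypothetical time-series table.
            using (var insert = new SqlCommand(
                "INSERT INTO dbo.SensorReadings (DeviceId, ReadingTime, Temperature) " +
                "VALUES (@device, @time, @temp)", connection))
            {
                insert.Parameters.AddWithValue("@device", "robot-01");
                insert.Parameters.AddWithValue("@time", DateTime.UtcNow);
                insert.Parameters.AddWithValue("@temp", 72.4);
                insert.ExecuteNonQuery();
            }

            // Read back the latest readings; the same code runs against SQL Server
            // or Azure SQL Database without modification.
            using (var query = new SqlCommand(
                "SELECT TOP (10) DeviceId, ReadingTime, Temperature " +
                "FROM dbo.SensorReadings ORDER BY ReadingTime DESC", connection))
            using (var reader = query.ExecuteReader())
            {
                while (reader.Read())
                {
                    Console.WriteLine($"{reader.GetString(0)} {reader.GetDateTime(1)} {reader.GetDouble(2)}");
                }
            }
        }
    }
}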

Key benefits:

◈ A programming surface area consistent with Azure SQL Database and SQL Server, so the SQL engine at the edge allows engineers to build once and deploy on-premises, in the cloud, or at the edge.
◈ The streaming capability enables instant analysis of incoming data for intelligent insights.
◈ In-database AI capabilities enable scenarios like anomaly detection, predictive maintenance, and other analytical scenarios without having to move data.


Train in the cloud and score at the edge


With a consistent programming surface area on-premises, in the cloud, and at the edge, developers can use identical methods for securing data in motion and at rest, while enabling high availability and disaster recovery architectures equal to those used in Azure SQL Database and SQL Server. This seamless transition of the application between locations means, for example, that an algorithm can be trained against a cloud data warehouse and the resulting machine learning model pushed to Azure SQL Database Edge to run scoring locally, delivering real-time scoring from a single codebase.

Intelligent store and forward


The engine can take streaming datasets and replicate them directly to the cloud, while also enabling an intelligent store-and-forward pattern. At the same time, the edge can leverage its analytical capabilities while processing streaming data or applying machine learning using in-database machine learning. Fundamentally, the engine can process data locally and upload it using native replication to a central data center or the cloud for aggregated analysis across multiple edge hubs.
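The store-and-forward idea itself is simple. The sketch below illustrates the generic pattern in C#; it is not SQL Database Edge's native replication, just a simplified buffer that holds locally processed results while the uplink is down and flushes them once connectivity returns.

using System;
using System.Collections.Generic;

// Generic store-and-forward buffer: process data locally, queue the results,
// and forward the backlog whenever the connection to the cloud is available.
class StoreAndForwardBuffer
{
    private readonly Queue<string> pending = new Queue<string>();

    // Called for every locally processed result (for example, an aggregate or anomaly score).
    public void Enqueue(string payload) => pending.Enqueue(payload);

    // Called periodically; isConnected and sendToCloud stand in for real
    // connectivity checks and upload logic.
    public void Flush(Func<bool> isConnected, Action<string> sendToCloud)
    {
        while (pending.Count > 0 && isConnected())
        {
            sendToCloud(pending.Dequeue());
        }
    }
}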


Thursday, 27 September 2018

Eight new features in Azure Stream Analytics

We are excited to announce eight new features in Azure Stream Analytics (ASA). These new features include:

◈ Support for query extensibility with C# custom code in ASA jobs running on Azure IoT Edge.

◈ Custom de-serializers in ASA jobs running on Azure IoT Edge.

◈ Live data Testing in Visual Studio.

◈ High throughput output to SQL.

◈ ML based Anomaly Detection on IoT Edge.

◈ Managed Identities for Azure Resources (formerly MSI) based authentication for egress to Azure Data Lake Storage Gen 1.

◈ Blob output partitioning by custom date/time formats.

◈ User defined custom re-partition count.

The features that are generally available and those in public preview will start rolling out imminently.

General availability features


Parallel write operations to Azure SQL

Azure Stream Analytics now supports high performance and efficient write operations to Azure SQL DB and Azure SQL Data Warehouse to help customers achieve four to five times higher throughput than what was previously possible. To achieve fully parallel topologies, ASA will transition SQL writes from serial to parallel operations while simultaneously allowing for batch size customizations.

Configuring high-throughput write operations to SQL

Public previews


Query extensibility with C# UDF on Azure IoT Edge

Azure Stream Analytics offers a SQL-like query language for performing transformations and computations over streams of events. Though there are many powerful built-in functions in the currently supported SQL language, there are instances where a SQL-like language doesn't provide enough flexibility or tooling to tackle complex scenarios.

Developers creating Stream Analytics modules for Azure IoT Edge can now write custom C# functions and invoke them directly in the query through user-defined functions (UDFs). This enables scenarios like complex math calculations, importing custom ML models using ML.NET, and programming custom data imputation logic. A full-fidelity authoring experience is available in Visual Studio for these functions.
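As a rough sketch, such a UDF is just a public static method compiled into the Edge job's code-behind project and referenced from the query through the UDF prefix; the namespace, class, and method names below are illustrative, and the exact project setup comes from the Visual Studio tooling.

using System;

namespace ASAEdgeUDFDemo
{
    // Custom C# functions for a Stream Analytics module running on IoT Edge.
    // Each public static method can be invoked from the ASA query.
    public static class Telemetry
    {
        // Example: convert a raw ADC reading to degrees Celsius (illustrative calibration).
        public static double ToCelsius(double rawValue)
        {
            return (rawValue / 1023.0) * 100.0 - 20.0;
        }

        // Example: clamp obviously invalid sensor values before aggregation.
        public static double Clamp(double value, double min, double max)
        {
            return Math.Min(Math.Max(value, min), max);
        }
    }
}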

Definition of the C# UDF in Visual Studio

Calling the C# UDF from ASA Query

Output partitioning to Azure Blob Storage by custom date and time formats

Azure Stream Analytics users can now partition output to Azure Blob storage based on custom date and time formats. This feature greatly improves downstream data-processing workflows by allowing fine-grained control over the blob output, especially for scenarios such as dashboarding and reporting. In addition, partitioning by custom date and time formats enables stronger alignment with downstream Hive-supported formats and conventions when the output is consumed by services such as Azure HDInsight or Azure Databricks.

Partition by custom date or time on Azure portal

The ability to partition output to Azure Blob storage by custom field or attribute continues to be in private preview.

Setting partition by custom attribute on Azure portal

Live data testing in Visual Studio

Available immediately, Visual Studio tooling for Azure Stream Analytics further enhances the local testing capability to help users test their queries against live data or event streams from cloud sources such as Azure Event Hubs or IoT Hub. This includes full support for Stream Analytics time policies in the locally simulated Visual Studio IDE environment.

This significantly shortens development cycles as developers no longer need to start/stop their job to run test cycles. Also, this feature provides a fluent experience for checking the live output data while the query is running.

Live Data Testing in Visual Studio IDE

User defined custom re-partition count

We are extending our SQL language to optionally enable users to specify the number of partitions of a stream when performing repartitioning. This enables better performance tuning for scenarios where the partition key can't be changed due to upstream constraints, where the output has a fixed number of partitions, or where partitioned processing is needed to scale out to a larger processing load. Once repartitioned, each partition is processed independently of the others.

With this new language feature, a query developer can simply use the newly introduced keyword INTO after the PARTITION BY clause. For example, the query below reads from the input stream (regardless of whether it is naturally partitioned), repartitions the stream into 10 partitions based on the DeviceID dimension, and flushes the data to the output.

SELECT * INTO [output] FROM [input] PARTITION BY DeviceID INTO 10

Private previews


Built-in models for Anomaly Detection on Azure IoT Edge and cloud

By providing ready-to-use ML models right within our SQL-like language, we empower every developer to easily add Anomaly Detection capabilities to their ASA jobs, without requiring them to develop and train their own ML models. This in effect reduces the whole complexity associated with building ML models to a simple single function call.

Currently, this feature is available for private preview in the cloud, and we are happy to announce that these ML functions for built-in Anomaly Detection are also being made available for ASA modules running on the Azure IoT Edge runtime. This will help customers who demand sub-second latencies, or who operate in scenarios where connectivity to the cloud is unreliable or expensive.

In this latest round of enhancements, we have been able to reduce the number of functions from five to two while still detecting all five kinds of anomalies: spike, dip, slow positive increase, slow negative decrease, and bi-level change. Also, our tests are showing a remarkable five to ten times improvement in performance.

Sedgwick, a global provider of technology-enabled risk, benefits, and integrated business solutions, has been engaged with us as an early adopter of this feature.

“Sedgwick has been working directly with Stream Analytics engineering team to explore and operationalize compelling scenarios for Anomaly Detection using built-in functions in the Stream Analytics Query language. We are convinced this feature holds a lot of potential for our current and future scenarios”.

Custom de-serializers in Stream Analytics module on Azure IoT Edge


Today, Azure Stream Analytics supports input events in the JSON, CSV, or AVRO data formats out of the box. However, millions of IoT devices are often optimized to generate data in other formats that encode structured data in a more efficient yet extensible way.

Going forward, IoT devices sending data in any format, be it Parquet, Protobuf, XML, or any binary format, can leverage the power of Azure Stream Analytics. Developers can now implement custom de-serializers in C#, which can then be used to de-serialize events received by Azure Stream Analytics.
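To make the idea concrete, the sketch below shows only the parsing core such a de-serializer would wrap: it turns a simple, hypothetical fixed-layout binary record stream into typed events. The surrounding ASA plumbing (the de-serializer base class and its registration) comes from the custom de-serializer SDK and is omitted here.

using System.Collections.Generic;
using System.IO;

// A typed event produced from a hypothetical custom binary format.
public class SensorEvent
{
    public int DeviceId { get; set; }
    public long Timestamp { get; set; }
    public double Value { get; set; }
}

public static class CustomFormatParser
{
    // Parses a stream of fixed-layout binary records:
    // [int32 deviceId][int64 Unix timestamp][double value]
    // A real ASA custom de-serializer would call logic like this from its Deserialize method.
    // Assumes a seekable stream for brevity.
    public static IEnumerable<SensorEvent> Parse(Stream input)
    {
        using (var reader = new BinaryReader(input))
        {
            while (input.Position < input.Length)
            {
                yield return new SensorEvent
                {
                    DeviceId = reader.ReadInt32(),
                    Timestamp = reader.ReadInt64(),
                    Value = reader.ReadDouble()
                };
            }
        }
    }
}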

Configuring input with a custom serialization format

Managed identities for Azure resources (formerly MSI) based authentication for egress to Azure Data Lake Storage Gen 1

Users of Azure Stream Analytics will now be able to operationalize their real-time pipelines with MSI-based authentication when writing to Azure Data Lake Storage Gen 1.

Previously, users depended on Azure Active Directory-based user authentication for this purpose, which had several limitations. With managed identities, users can now automate their Stream Analytics pipelines through PowerShell. It also allows long-running jobs to proceed without being periodically interrupted for sign-in renewals. Finally, it makes the user experience consistent across almost all ingress and egress services that are integrated out of the box with Stream Analytics.


Configuring MSI-based authentication to Data Lake Storage

Tuesday, 3 July 2018

Migrating IoT Edge solutions to GA

This release adds tons of new features for those already using public preview bits. Customers who have never used Azure IoT Edge can start with the Linux or Windows quickstarts. Those who have started projects on preview bits should upgrade to the latest bits and integrate the breaking changes. Details on both of these processes are below.

Upgrade to the latest bits


Uninstall preview bits

Use iotedgectl to uninstall the preview bits from your Edge device by running the following command. You can skip this step if you are installing GA bits on a device or VM that has never run preview bits.

iotedgectl uninstall

Delete preview runtime container images

Use “docker rmi” to remove the container images for preview versions of Edge Agent and Edge Hub from your Edge device. You can skip this step if you are installing GA bits on a device or VM that has never run preview bits.

Remove references to preview container images in deployments

The IoT Edge Security Daemon includes functionality that allows the user to specify which versions of the Edge Agent and Edge Hub are used by an Edge device. This allows you to control when the bits on your device get updated. The versions of these runtime components are specified in your deployment. Deployments created during the preview include references to preview container images. You must change them to reference the GA container images, ensuring that the Security Daemon does not downgrade your device to preview bits. An example deployment snippet is below.

"systemModules": {
     "edgeAgent": {
         "type": "docker",
         "settings": {
             "image": "microsoft/azureiotedge-agent:1.0-previewmcr.microsoft.com/azureiotedge-agent:1.0",
             "createOptions": "{}"
         }
     },
     "edgeHub": {
         "type": "docker",
         "settings": {
             "image": "microsoft/azureiotedge-hub:1.0-previewmcr.microsoft.com/azureiotedge-hub:1.0",
             "createOptions": "{}"
         },
         "status": "running",
         "restartPolicy": "always"
     }
},

Install GA bits


The quickstarts for Linux and Windows walk you through setting up an IoT Edge device with these new infrastructural components. You will notice that the quickstarts start by installing a component introduced with GA called the IoT Edge Security Daemon. This is part of the Edge Security Manager, a set of components that help us secure the edge.

The IoT Edge Security Daemon starts each time an Edge device boots and initiates the IoT Edge Agent. This means that iotedgectl is no longer needed when working with IoT Edge devices.

Hsmlib is another new component which is part of the IoT Edge Security Manager. Different versions of hsmlib integrate with device-specific hardware and allow the IoT Edge Security Daemon to perform cryptographic functions. Advanced types of hardware enable IoT Edge to provide innovative security and functionality. For example, devices with a hardware security module (e.g., a TPM) can be provisioned via the Device Provisioning Service.



A default version of hsmlib is installed along with the IoT Edge Security Daemon. This version of hsmlib supports our standard security promise, which uses the Edge device's file system for storing secrets, as well as our secure enclave promise using the TPM at /tpm0.

Integrate breaking changes


The vast majority of feature additions have been done in a backwards compatible manner making it easy to migrate your projects on to GA bits. We’ve learned an incredible amount from having the product in public preview and are making a few small breaking changes based on user feedback. These changes are as small as possible while still delivering usability improvements to the product.

Client SDK split


Change

The DeviceClient object, used to write IoT Edge modules, will split into two objects: DeviceClient and ModuleClient.

Impact

Small code change. You must change the class name for all DeviceClient objects in your module code to ModuleClient. We've also introduced a factory class for creating ModuleClient instances; this helper class hides the details of correctly integrating with the IoT Edge Security Daemon.

Before

DeviceClient deviceClient = DeviceClient.CreateFromConnectionString(connStr, transportSettings);

After

ModuleClient moduleClient = ModuleClient.CreateFromEnvironment(transportSettings);

Additionally, the API surface of the ModuleClient will be reduced to only the functionality that is currently supported for modules. This should not affect any existing module code, as the removed APIs do not work for modules today.
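For context, a minimal module built against the GA client SDK looks roughly like the sketch below: create the client from the module's environment, register an input handler, and forward results to a named output. The input and output names are illustrative, and the factory and method names should be checked against the SDK version in use (the released SDK exposes async variants such as CreateFromEnvironmentAsync).

using System.Text;
using System.Threading.Tasks;
using Microsoft.Azure.Devices.Client;
using Microsoft.Azure.Devices.Client.Transport.Mqtt;

class Program
{
    static async Task Main()
    {
        // The environment-based factory reads the connection details injected by the
        // IoT Edge Security Daemon, so no connection string is handled in module code.
        ITransportSettings[] settings = { new MqttTransportSettings(TransportType.Mqtt_Tcp_Only) };
        ModuleClient moduleClient = await ModuleClient.CreateFromEnvironmentAsync(settings);
        await moduleClient.OpenAsync();

        // Process messages arriving on "input1" and forward them to "output1".
        await moduleClient.SetInputMessageHandlerAsync("input1", async (message, userContext) =>
        {
            string body = Encoding.UTF8.GetString(message.GetBytes());
            // ... apply filtering, scoring, or enrichment here ...
            await moduleClient.SendEventAsync("output1", new Message(Encoding.UTF8.GetBytes(body)));
            return MessageResponse.Completed;
        }, null);

        await Task.Delay(-1); // keep the module running
    }
}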

Reason for change

The functionality available to devices and modules is slightly different. For example:

◈ Modules can read/write from inputs/outputs while devices can only write to IoT Hub.
◈ Devices can receive C2D messages while modules cannot.
◈ Devices can perform FileUpload while modules cannot.

Separating the clients cleanly delineates the functionality available to devices and modules.

Direct method invocation


Change

The ServiceClient class will no longer allow module identities to invoke direct methods. This functionality has been moved to the ModuleClient class as part of the Client SDK split detailed above.

Impact

You must use the ModuleClient class to invoke methods for module identities instead of the ServiceClient.

Before

// assumes the use of a module connection
ServiceClient serviceClient = ServiceClient.CreateFromConnectionString(connStr, transportSettings);
serviceClient.InvokeDeviceMethodAsync("DeviceId", "ModuleId", new CloudToDeviceMethod("MethodName"));

After

ModuleClient moduleClient = ModuleClient.CreateFromEnvironment(transportSettings);
moduleClient.InvokeMethodAsync("DeviceId", "ModuleId", new DirectMethodRequest("MethodName"));
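On the receiving side, the target module registers a handler for the method name. A minimal sketch is below; it assumes the usings and the moduleClient instance from the earlier example, and the method name and response payload are illustrative.

// Register a handler in the target module for incoming direct method calls.
await moduleClient.SetMethodHandlerAsync("MethodName", (methodRequest, userContext) =>
{
    // Inspect methodRequest.DataAsJson and perform the requested action here.
    byte[] responsePayload = Encoding.UTF8.GetBytes("{\"result\":\"ok\"}");
    return Task.FromResult(new MethodResponse(responsePayload, 200));
}, null);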

Reason for change

Invoking a method is an action users expect to be able to do with the ModuleClient, instead of instantiating a second object. Adding method invocation to ModuleClient reduces the size of module code as developers no longer need to include the ServiceClient in their code.

Configure port on which Edge Hub listens


Change

Edge Hub used to listen for connections from downstream devices on ports 8883 and 443. Now developers must specify which host ports it uses for these connections.

Impact

You must set port forwarding information on Edge Hub when manually creating a deployment's JSON. The JSON below configures Edge Hub to work as before (listening on ports 8883 and 443).

"{\"HostConfig\":{\"PortBindings\":{\"8883/tcp\":[{\"HostPort\":\"8883\"}],\"443/tcp\":[{\"HostPort\":\"443\"}]}}}"

You do not need to make any changes if you're creating a deployment through the Azure portal or with the VS Code extension. These tools will automatically set up port forwarding options to make Edge Hub listen for connections on 8883 and 443.

Reason for change

Some customers may want an Edge device to listen for downstream device connections on a port other than 443 or 8883, because those ports could be used for a different purpose. For example, customers may want to run an HTTPS server on 443.

Thursday, 21 June 2018

Enabling Smart Manufacturing with Edge Computing

Smart Manufacturing envisions a future where factory equipment can make autonomous decisions based on what’s happening on the factory floor. Businesses can more easily integrate all steps of the manufacturing process including design, manufacturing, supply chain and operation. This facilitates greater flexibility and reactivity when participating in competitive markets. Enabling this vision requires a combination of related technologies such as IoT, AI/machine learning, and Edge Computing. In this article, we will introduce Edge Computing and discuss its role in enabling Smart Manufacturing.

What is Edge Computing?


Put simply, Edge Computing is about taking code that runs in the cloud and running it on local devices, or close to them, such as in a gateway device or a PC sitting next to the device.

To understand Edge Computing it helps to think of an IoT solution as generally having three components:

◈ Things like IoT devices, which generate sensor data.
◈ Insights you extract from this data.
◈ Actions you perform based on these insights to deliver some sort of value.

With Edge Computing, you move the insights and actions components from the cloud to the device. In other words, you bring some of the code used to process the data, extract insights from it, and perform some action in response right onto the device.


Edge Computing model

You may still want to send data to the cloud for further processing and actions, but now you have the option of balancing your workloads more efficiently based on latency, compute power, or data privacy requirements. Notably, one of the things we can move to the edge is machine learning and AI algorithms. When we do this, we say that we have an intelligent edge.

When is Edge Computing useful?


Edge computing is useful in the following situations frequently found in manufacturing:

◈ Mission-critical scenarios where you need real-time insights for quick decision making, basically when the electrons cannot travel to the cloud and back fast enough to do what you need to do. Think about an autonomous vehicle approaching a red light: it can't afford the delay of routing the message to the cloud and waiting for a command back. When the car sees a red light, it needs to perform analytics in a closed loop and stop immediately. Increasingly, these insights are being provided by machine learning models running on the device, performing predictive analytics, anomaly detection, image recognition, and classification.
◈ Remote sites where cloud connectivity is not stable or economical. For example, remote oil fields send data to the cloud only twice a day via satellite connection.
◈ Geographies where compliance and data residency are critical. You can pre-process sensitive data locally within a sovereign or organizational boundary.
◈ Avoiding transferring terabytes of raw data between devices and the cloud to reduce bandwidth costs. You can perform data cleaning and aggregation locally.


Remote oil fields are prime candidates for Edge Computing

How does Edge Computing compare to private cloud?


Private cloud is essentially a model of cloud where IT services are provisioned over private infrastructure for the dedicated use of the organization. Private cloud is best suited where you have cloud applications or application components that need to be deployed on-premises to meet latency, connectivity, or regulatory requirements, for example when you need to isolate your environment from the public cloud.

Edge Computing, on the other hand, is about running computations on the device, and communicating to the cloud when needed. Now, the cloud these devices communicate with can be the public or private cloud. The device doesn’t really care.

Edge and private cloud can operate in the same environment. A good example would be a cruise ship, where passenger services and navigation apps would run on the private cloud and engine maintenance would run at the edge.

Now, to make things more interesting, there is also the option of running a hybrid cloud, which is a combination of private and public. In fact, many companies run their cloud environments this way because it gives them the best of both worlds: the latency and privacy of the private cloud with the scalability and economies of scale of the public cloud. So, as you can imagine, you can have scenarios where you are running a hybrid cloud environment, with some things running in the public cloud and some things running in the private cloud, while your IoT devices are running Edge Computing, with some things running on the devices and some things running in the cloud.


Hybrid Cloud and Edge Computing Scenario

How does Edge Computing enable Smart Manufacturing?


The smart part of a smart factory is about autonomy: the ability of an asset or piece of equipment to make decisions based on what's going on on the factory floor without the need for human intervention, decisions that are not necessarily pre-programmed in some factory control system. The aim of Smart Manufacturing is to use a more programmatic, data-led approach to develop new and higher quality goods faster. Edge Computing can enable this autonomy, with machines on the factory floor extracting insight and formulating actions in near real time, running AI/machine learning algorithms in their own electronics, almost as if they had their own brains!

Consider this example where you have two robots that are performing some task. The robots are connected to an edge device that is running a machine learning model listening to sensor data from the robots and whose mission is to predict an impending failure.


When the machine learning model determines that one of the robots is about to fail, it triggers actions. These actions may include stopping or slowing down the line that feeds the robot in trouble, and/or triggering a process in the cloud to create a service request in some line-of-business application.
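A highly simplified sketch of the decision logic such an edge module might run is shown below; the scoring function, threshold, and actions are illustrative stand-ins for a real trained model and line-control integration.

using System;

class PredictiveMaintenance
{
    // Stand-in for a trained ML model: returns a failure probability for a reading.
    static double ScoreFailureRisk(double vibration, double temperature)
    {
        // Illustrative heuristic only; a real module would evaluate a trained model here.
        return Math.Min(1.0, 0.01 * vibration + 0.005 * Math.Max(0, temperature - 60));
    }

    static void OnSensorReading(string robotId, double vibration, double temperature)
    {
        double risk = ScoreFailureRisk(vibration, temperature);
        if (risk > 0.8)
        {
            // Local, low-latency action: slow down or stop the line feeding the robot.
            Console.WriteLine($"Slowing line feeding {robotId} (risk={risk:F2})");

            // Cloud-side action: trigger a service request in a line-of-business application.
            Console.WriteLine($"Queueing service request for {robotId}");
        }
    }

    static void Main()
    {
        OnSensorReading("robot-01", 70.0, 90.0);
        OnSensorReading("robot-02", 5.0, 40.0);
    }
}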

What does Azure offer to enable Edge Computing?


The service Azure offers to enable Edge Computing is called Azure IoT Edge. IoT Edge is made up of three components:

◈ Azure IoT Edge modules are containers that run Azure services, 3rd party services, or your own code. They are deployed to IoT Edge devices and execute locally on those devices.
◈ The Azure IoT Edge runtime runs on each IoT Edge device and manages the modules deployed to each device. An interesting fact is that this runtime will be open-sourced to the developer community so that developers can make changes and additions to it.
◈ A cloud-based interface enables you to remotely monitor and manage IoT Edge devices.


Monday, 27 November 2017

Azure Stream Analytics now available on IoT Edge

We are announcing the public preview of Azure Stream Analytics running on Azure IoT Edge. Azure Stream Analytics on IoT Edge empowers developers to deploy near-real-time analytical intelligence closer to IoT devices so that they can unlock the full value of device-generated data. Designed for customers requiring low latency, resiliency, efficient use of bandwidth, and compliance, it lets enterprises deploy control logic close to their industrial operations and complement the big data analytics done in the cloud.



Why put analytics closer to the data?


With Azure Stream Analytics (ASA) on IoT Edge, enterprises benefit from running Complex Event Processing (CEP) closer to where the data is produced, in the following scenarios:

◉ Low-latency command and control: For example, manufacturing safety systems are required to respond to operational data with ultra-low latency. With ASA on IoT Edge, you can analyze sensor data in near real time and issue commands when you detect anomalies to stop a machine or trigger alerts.

◉ Limited connectivity to the cloud: Mission critical systems, such as remote mining equipment, connected vessels or offshore drilling, need to analyze and react to data even when cloud connectivity is intermittent. With ASA, your streaming logic runs independently of the network connectivity and you can choose what you send to the cloud for further processing or storage.

◉ Limited bandwidth: The volume of data produced by jet engines or connected cars can be so large that data must be filtered or pre-processed before sending it to the cloud. Using ASA, you can filter or aggregate the data that needs to be sent to the cloud.

◉ Compliance: Regulatory compliance may require some data to be locally anonymized or aggregated before being sent to the cloud. With ASA, you can aggregate data coming from various sources, or in a given time window, for example.

During the private preview of ASA on IoT Edge, we received positive feedback validating the use of ASA for these scenarios. Hiroyuki Ochiai, Director of the IT platform division for NEC Corporation, said, “Azure Stream Analytics on IoT Edge increases the responsiveness of IoT solutions, while ensuring data privacy and sovereignty by processing data locally on IoT Edge. We see great potential to use this service across both our own IoT solutions, and those of our customers who benefit from NEC’s Azure Plus consultancy.”

Move between edge and cloud easily


With ASA on IoT Edge, you can easily use CEP for your IoT scenarios using the same interface and the same SQL-like language for both cloud and edge analytics jobs. This makes it easy to move analytics between edge and cloud. Our SQL language notably enables temporal-based joins, windowed aggregates, temporal filters, and other common operations such as aggregates, projections, and filters.

ASA on IoT Edge offers a cross-platform solution running on Docker containers that can be deployed on multiple platforms (Linux or Windows) and multiple architectures (Intel or ARM CPUs). This allows ASA to run on a large variety of devices, from small-footprint devices such as Raspberry Pi to industrial PCs, dedicated field gateways, or servers.

By leveraging Azure IoT Edge to secure, deploy and manage your IoT solutions from the cloud, you can easily deploy Azure Stream Analytics to thousands of devices.

Get started now


The ASA on IoT Edge preview is being deployed and will be enabled for all customers by Friday, November 17, 2017. For more information, refer to the ASA on IoT Edge documentation, or go directly to the Azure portal to create a new ASA job and simply select “Edge” as the hosting environment. An end-to-end tutorial is also available for a quick start.
