Deploy bot application to Azure and register in Microsoft Bot Framework

Create your first chat bot application using Microsoft Bot Framework

Publish Application to Azure

  • Right-click the web application and click Publish


  • Select Microsoft Azure App Service and create a new app service
  • Click Publish


  • Type the app name and choose the subscription, resource group, and app service plan
  • Click Create


  • The application is deployed to Azure and you will see the confirmation page in the browser
  • Note the Azure application URL
  • The messaging endpoint is the application URL with /api/messages appended


Register Bot with Microsoft Bot Framework

  • Go to the Bot Framework portal
  • Click My bots -> Create a bot
  • Click the Create button
  • You will be redirected to the Azure portal
  • Click “Web App Bot” and then click Create


Create a new bot service

  • From Web App Bot, click the Create button to start the bot creation process
  • Type the bot name
  • Type the app name
  • You can keep the other options at their defaults or change them as needed



Obtain Microsoft App Id and Microsoft App Password

  • Click Settings on the newly created bot service
  • Update the messaging endpoint


  • Click the Manage link (beside Microsoft App ID)
  • Click Generate New Password, note the new Microsoft App ID and password, and then click Save


  • Now open the web.config of the bot application and add the Microsoft App ID and password.
  • Publish the application to Azure again
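For reference, the Bot Application template reads these credentials from appSettings in web.config. A sketch of the relevant fragment, with placeholder values (paste the App ID and password obtained above):

```xml
<configuration>
  <appSettings>
    <!-- Placeholder values: replace with the ID and password from the Bot Framework portal -->
    <add key="BotId" value="YourBotId" />
    <add key="MicrosoftAppId" value="your-microsoft-app-id" />
    <add key="MicrosoftAppPassword" value="your-microsoft-app-password" />
  </appSettings>
</configuration>
```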


  • Now go to the Web App Bot in Azure
  • Click Test in Web Chat
  • In the right-side window, type a message and press Enter.
  • You will see the bot's reply.


If the bot replies as above, everything is working. Cheers !!

Software Review – Stellar Phoenix SQL Database Repair

In today's fast-evolving digital environment, database corruption is a common occurrence. It strikes silently at any moment and takes a toll on transactions, performance, and database availability. A SQL database can be marked SUSPECT for many reasons, from an application crash or improper shutdown to a missing transaction log, and any of these can badly disrupt production.

Therefore, to counter its impact and repair an affected SQL database, precise recovery is a dire need. The best approach is to employ a repair tool that combats the corruption and repairs the database efficiently.

Thinking of buying a reliable repair tool? Jumbled among tons of options? Wondering which one is right for you? Stellar Phoenix SQL Database Repair can serve the purpose. It has been evaluated, tested, and approved, and this fast, powerful tool is a first choice for many professionals.

Stellar Phoenix SQL Database Repair: Transparent Analysis

Product Details

  • Product Name: Stellar Phoenix SQL Database Repair
  • Version: 8.0
  • Type: Do-it-yourself
  • Language Support: English
  • Limitations: NA

Minimum System Requirement:

  • Processor: Pentium Class
  • Operating System: Windows 10, 8, 7, Vista, XP and Windows Server 2008, 2003
  • Memory: At least 1 GB
  • Hard Disk: Minimum of 50 MB
  • Version Support: MS SQL Server 2016, 2014, 2012, 2008 R2, 2008, 2008 (64-bit), 2008 Express, 2005, 2005 (64-bit), 2005 Express, 2000, 2000 (64-bit), 7.0 and mixed formats

Software Versions:

Demo Version

  • Intended for evaluation purposes
  • Allows only a preview of recoverable MDF file content

License Version

  • Facilitates saving the repaired database
  • Permits you to take advantage of all features

Brief Outline

An impressive do-it-yourself SQL recovery tool designed to fight back against almost all SQL Server database damage or corruption scenarios, from unexpected system shutdown and virus attack to media read errors. Further, it recovers inaccessible MS SQL Server database files, both MDF and NDF.

Backed by powerful non-destructive repair algorithms, this dedicated solution promises full database integrity during repair and recovery. With this tool, you can safely recover tables, rules, indexes, triggers, keys, and defaults. The best thing about this software is its ability to recover even heavily damaged files seamlessly.

Key Features:

  • Supports recovery of deleted records
  • Capability to save the repaired database to a live database
  • Capability to save the repaired database in CSV, HTML, and XLS formats

Prominent Features:

  • Fast scanning algorithms
  • Facilitates Recoverable Database Objects Preview
  • Aids Object Name Search in tree view
  • Facilitates creation of Sorted tables in tree view
  • Prepares a distinct log report after scanning the database
  • Facilitates automatic recreation of a new database
  • Option to save the scan result automatically

Support Options

  • SQL Server Large MDF and NDF files
  • MS SQL Server Sequence Objects
  • Standard Compression Scheme for Unicode
  • MS SQL Server ROW Compressed data
  • MS SQL Server PAGE Compressed data
  • XML data types, XML indexes
  • SQL Server File stream data types, sparse columns, columns set property

Recovery Options

  • Column Row GUID COL Property
  • Defaults and Default constraints
  • Sp_addextended Property
  • Stored Procedure, Synonyms, and Functions
  • Tables, Identity Triggers, Indexes, Collations, and Views
  • Predefined defaults, default values, and Rules
  • Primary, Foreign, and Unique Keys
  • Check constraints, Null / Not Null, and User Defined Data Types

Positive Traits

  • Secure
  • Reliable
  • Easy to use
  • Straightforward
  • Simple user-interface
  • Ensures Data Integrity

ROW and PAGE Compressed Data Recovery Is In Frame

The most distinctive feature of this software is its ability to recover SQL tables with PAGE and ROW compression, a capability in high demand among users. In addition, it offers support for the SQL Server 2008 R2 Standard Compression Scheme for Unicode (SCSU).

Powerful Algorithms to Safeguard Data Integrity

Thanks to its powerful algorithms, data integrity, the top concern for every user, is always maintained. The software comprehensively scans MDF files and efficiently recovers as much data as possible.

Deleted Record Recovery Is No Longer a Hassle

This software enables you to recover deleted records from a corrupt database effortlessly, without any alteration to the original hierarchy. After recovery, you can easily save them in a newly created table.

Multiple Saving Options for Added Convenience

This powerful software is programmed to offer as much ease as possible. To provide the utmost comfort, it comes with multiple saving options, and you can choose the desired one for the repaired SQL Server database. A hidden benefit of this feature is that you do not need SQL Server on your system to access the saved file.

All Database Components Recovery Is In Frame

Another efficient feature speaks to its versatility: it lets you recover almost everything, from column set properties, keys, rules, and indexes to stored procedures, in a hassle-free manner.

Selective Recovery Is No Longer a Tedious Task

With this software, you can effortlessly perform selective recovery of database objects. It allows you to choose the desired database objects and save them to a specific location.

Disruption Is No Longer a Hindrance

Another significant feature of this software is its ability to reconnect to the server automatically if the connection is interrupted during repair. Thanks to this feature, the repair continues smoothly.

How does it Work?

The functionality of this software is simple and straightforward. Just follow the steps below.

Steps to repair and recover:

    1. Download, install, and launch the Stellar Phoenix SQL Database Repair software using the activation key
    2. Click Select Database and select the database for recovery (if you do not know the exact location, click Find Database -> Folder -> Search)
    3. Click Repair
    4. All repaired database objects are listed in the left pane
    5. Click a desired object to preview its data in the right pane
    6. Now save the repaired database: click the Save button
    7. You have 4 options for saving the repaired database; here, I chose the MSSQL option
    8. You can see New database and Live database options; I selected New database. Click Browse and specify the destination details
    9. On connection, click Connect
    10. When the "File saved at the desired path" dialog box appears, click OK


The repair and recovery process is complete.

Final Verdict

Every database user searches for a recovery tool they can rely on to resolve both day-to-day and severe database corruption issues effortlessly. Stellar Phoenix SQL Database Repair has the traits to work efficiently in almost all corruption cases. Moreover, it has an edge over its competitors in ease of use, scanning performance, flexible options, technical support, and much more. Personally, I rate this software 9 out of 10. Try it!


Create your first chat bot application using Microsoft Bot Framework

What is chatbot?

  • A bot is automated software designed by human programmers to perform tasks
  • A chatbot is automated software that talks to customers using messaging apps

Why Chatbot?

  • One-to-one messaging with thousands of customers at a time
  • Available 24/7
  • Lots of application in real world
    • Bank and Insurance
    • HR issues
    • Ordering Pizza
    • Customer support
    • Personal Finance Assistant
    • Schedule Meeting
    • Product Suggestions
    • Weather forecasting etc.

What is Microsoft Bot Framework?

  • A platform for building, connecting, testing and deploying powerful and intelligent bots
  • Open source
  • Connects across platforms with the flip of a switch

Installation Requirements

  • Visual Studio 2015 / 2017 (in this demo I use Visual Studio 2017)
  • Download the bot template: Visual Studio Bot Template – C#
  • Save the zip file to Visual Studio 2015/2017 template directory
    “%USERPROFILE%\Documents\Visual Studio 2017\Templates\ProjectTemplates\Visual C#”
  • Create First Application
    • Start Visual Studio 2017
    • From the File menu -> New -> Project
    • Select Visual C# template
    • Select Bot Application
    • Enter project name (For example DigitalAssistant)
    • Browse save location
    • Click OK


    • A bot application with the default structure will be created.
    • The default project structure of a bot application is shown below


    Now run your application; you will see the following screen.

    If you see the above screen, your application was created successfully.

    Test your project using bot emulator

    • Download the bot emulator from the following link and install it
    • Test your project by typing http://localhost:3979/api/messages into the emulator URL field
    • Click the emulator's Connect button
    • Now type a message in the emulator
    • You will see the following screen
    • If you see a reply message like this, the bot works properly.
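The default template simply echoes each message back with its character count, which is the reply you should see in the emulator. The exchange can be sketched in Python (a sketch only: the field names follow the Bot Framework Activity JSON schema, and make_echo_reply is an invented helper, not part of the template):

```python
def make_echo_reply(activity: dict) -> dict:
    """Build the echo reply for an incoming 'message' activity."""
    text = activity.get("text", "")
    return {
        "type": "message",
        # A reply travels in the opposite direction of the incoming activity
        "from": activity.get("recipient"),
        "recipient": activity.get("from"),
        "conversation": activity.get("conversation"),
        "replyToId": activity.get("id"),
        "text": f"You sent {text} which was {len(text)} characters",
    }

incoming = {"type": "message", "id": "1", "text": "hello",
            "from": {"id": "user"}, "recipient": {"id": "bot"},
            "conversation": {"id": "conv1"}}
print(make_echo_reply(incoming)["text"])  # You sent hello which was 5 characters
```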


    Next: Deploy bot application to Azure and register in Microsoft Bot Framework

    A simple data science experiment with Azure Machine Learning Studio

    What is Machine Learning, data science and Azure Machine Learning Studio?

    • Machine Learning is concerned with computer programs that automatically improve their performance through experience. A model learns from previous experience, i.e., data.
    • Data science, also known as data-driven science, is an interdisciplinary field about scientific methods, processes, and systems to extract knowledge or insights from data in various forms, either structured or unstructured, similar to data mining. (Wikipedia)
    • Azure Machine Learning Studio is a tool used to develop predictive analytics solutions in the Microsoft Azure cloud.

    Experiment Overview
    Azure Machine Learning Studio is an excellent tool for developing and hosting machine learning applications. You don't need to write code; you can build an experiment by drag and drop. Here we will create a simple machine learning experiment using Azure Machine Learning Studio.

    Tools and Technology used

    1. Azure Machine Learning Studio

    Now create our experiment step by step

    Step 1: Create Azure Machine Learning Workspace

    • Go to the Azure portal and log in using your Azure credentials
    • Click More Services in the left panel of the Azure portal
    • Click “Machine Learning Studio Workspace” under the “Intelligence + Analytics” category
    • Add a workspace by clicking the add (+) button at the top left corner
    • Choose a pricing tier; the figure below shows the pricing tiers
    • Finally, click the Create button



    Step 2: Launch Machine Learning Studio

    • Click “Launch Machine Learning Studio”
    • Then log in to the portal


    Step 3: Create a blank experiment

    • Select the Experiments menu, then click New (+) at the bottom left corner.
    • Click Blank Experiment. In addition to the blank experiment there are many sample experiments, which you can load and modify.
    • Once the new blank experiment has loaded, you will see the Azure ML Studio visual designer, as follows.



    Step 4: Add data set in the ML Studio visual designer

    • You can import a dataset or use a saved one. In this case we use a saved sample dataset.
    • Click Saved Datasets at the top left corner.
    • Drag and drop the “Adult Census Income Binary Classification dataset” from Saved Datasets -> Samples


    Step 5: Select columns in dataset

    • Expand Data Transformation -> Manipulation
    • Drag and drop “Select Columns in Dataset” to the visual surface
    • Connect the “Dataset” with “Select Columns in Dataset” in visual surface
    • Click the Select Columns in Dataset
    • Click Launch column selector in the property pane
    • Select “WITH RULES”
    • Add the age, education, marital-status, relationship, race, sex, and income columns, and finally click the tick mark at the bottom right corner.



    Step 6: Split up the dataset

    • Split your input data in two: training data and validation data
    • Expand “Data Transformation” -> “Sample and Split” from left pane
    • Drag and drop Split Data to Azure Machine Learning Studio visual surface
    • Connect the split module with “Select Columns in Dataset” in visual surface
    • Click the Split Data module and set Fraction of rows to 0.80 in the right pane of the visual designer surface. This means 80 percent of the data will be used for training and the rest for validation.
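Conceptually, the Split Data module just partitions the rows by the given fraction. A minimal Python sketch of an 80/20 split (split_rows is an invented helper, not the module's actual implementation, which also offers stratified and other split modes):

```python
import random

def split_rows(rows, fraction=0.80, seed=42):
    """Shuffle the rows and split them into (training, validation) lists."""
    rnd = random.Random(seed)           # fixed seed for reproducibility
    shuffled = rows[:]
    rnd.shuffle(shuffled)
    cut = int(len(shuffled) * fraction)
    return shuffled[:cut], shuffled[cut:]

train, valid = split_rows(list(range(100)))
print(len(train), len(valid))  # 80 20
```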


    Step 7: Train the model

    • Expand “Machine Learning” -> “Train” from left pane
    • Drag and drop “Train Model” to Azure ML Studio visual surface
    • Connect split dataset1 to train model (second point of train model as figure below)
    • Expand Machine Learning -> Initialize Model -> Classification from left pane
    • Drag and drop “Two-Class Boosted Decision Tree” as shown figure
    • Connect “Two-Class Boosted Decision Tree” to Train Model (first point of train model as figure below)


    Step 8: Choose columns for prediction

    • Click the Train Model
    • Click “Launch column selector” in the property pane
    • Select Include and add the column name “income”, because this experiment will predict income.
    • Click tick mark on the bottom right corner


    Step 9: Score the model

    • Expand “Machine Learning” -> “Score”
    • Drag and drop “Score Model” to the visual design surface.
    • Connect Train Model to Score Model (first point of Score Model as figure below)
    • Connect “Split” to “Score Model” (second point of Split with Second point of Score Model as figure below)


    Step 10: Evaluate the model

    • Expand “Machine Learning” -> “Evaluate”
    • Drag and drop “Evaluate Model” to the visual design surface.
    • Connect “Score Model” to “Evaluate Model” (first point of Evaluate Model as figure below)
    • Now click “Run” at the bottom of Azure ML Studio. After processing, if each stage is marked green, it completed successfully.
    • After the run completes, right-click Evaluate Model -> Evaluation Result -> Visualize
    • You will see the accuracy curve as shown below.
    • Click Save As at the bottom of the screen
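Among the metrics shown by Visualize, accuracy is the simplest: the fraction of scored rows whose predicted label matches the actual label. A toy illustration (not the Evaluate Model module's code):

```python
def accuracy(y_true, y_pred):
    """Fraction of predictions that match the true labels."""
    correct = sum(1 for t, p in zip(y_true, y_pred) if t == p)
    return correct / len(y_true)

# Four validation rows, three predicted correctly
print(accuracy(["<=50K", ">50K", ">50K", "<=50K"],
               ["<=50K", ">50K", "<=50K", "<=50K"]))  # 0.75
```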




    Step 11: Setup a web service

    • Click Setup Web Service -> Predictive Experiment
    • Connect Web Service Input to Score model (As shown below figure)
    • In “Select Columns in Dataset”, remove the income column from the dataset, because the model will now predict income.
    • Save and run the model from the bottom of ML Studio




    Step 12: Deploy Web Service

    • Click Deploy Web Service -> Deploy Web Service [Classic] at the bottom of ML Studio
    • After the deployment completes, you will see a dashboard with documentation for testing and consuming the service, as shown below
    • Click the Test button on the dashboard
    • You will see a popup dialog that takes input
    • Type input values as shown below and click the tick mark
    • You will see the predicted output, as in the figure. Here the predicted income is > 50K




    Now you have developed a simple data science experiment. You can embed it in your own application; the API links, security key, and necessary documentation are given on the dashboard.
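To consume the classic web service from code, you POST a JSON body in the Inputs/ColumnNames/Values shape shown in the dashboard's API documentation. A Python sketch of assembling that body (the URL and API key are placeholders for the values on your dashboard, build_request is an invented helper, and the column names must match your own experiment):

```python
import json

# Placeholders: copy the real values from the web service dashboard
URL = "<request-response endpoint from the dashboard>"
API_KEY = "<api key from the dashboard>"

def build_request(age, education, marital_status, relationship, race, sex):
    """Assemble the JSON body for a classic Azure ML request-response call."""
    return json.dumps({
        "Inputs": {
            "input1": {
                "ColumnNames": ["age", "education", "marital-status",
                                "relationship", "race", "sex"],
                "Values": [[str(age), education, marital_status,
                            relationship, race, sex]],
            }
        },
        "GlobalParameters": {},
    })

body = build_request(39, "Bachelors", "Never-married",
                     "Not-in-family", "White", "Male")
print(json.loads(body)["Inputs"]["input1"]["Values"][0][:2])  # ['39', 'Bachelors']
```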


    Access a comprehensive portfolio

    Find the products, services, and solutions you need to make the most of IoT business opportunities across devices, cloud, analytical capabilities, and business systems.

    Rely on a commitment to IoT

    Get more than an IoT vision. Because we’ve been investing in the Internet of Things before it was even called that, you can rely on a commitment to bring support and rapid innovation to your solutions, helping you stay ahead of the competition.

    Bring IoT to any device, any platform

    Deliver a flexible, scalable solution that adapts to your needs and processes. Connect to your choice of devices and operating systems, while using your existing infrastructure.

    Get trusted support

    Trust decades of experience and security working with companies like yours. Support your solution with enterprise technology designed for the needs of business, as well as our vast network of partners.

    How industries are using the Internet of Things

    Azure IoT Suite helps Sandvik Coromant stay on the cutting edge of “digital manufacturing”

    Technology has enabled some pretty amazing things, but being in two places at once still isn’t one of them. Fortunately, the Internet of Things (IoT) has enabled companies to use and share their expertise more efficiently, increasing productivity without needing to increase manpower.

    Take Sandvik Coromant, for example, part of the global industrial group Sandvik. As a global market leader, Sandvik Coromant has developed extensive know-how in tooling and the manufacturing industry over many decades. Since the emergence of digital solutions in manufacturing, Sandvik Coromant has successfully transferred this knowledge to so-called “digital manufacturing.”

    Sandvik Coromant has always been committed to the pursuit of technological development, and believes in working closely with manufacturing customers to provide reliable tools and tooling solutions. That dedication is embodied by a team of “yellow coats,” technical experts with extensive expertise who provide training and troubleshooting to customers in more than 150 countries. In addition to helping customers remotely and at Sandvik Coromant Centers, “yellow coats” also conduct on-site visits, where they can make adjustments as needed and provide recommendations to improve the customers’ manufacturing process.

    Continued growth created a new challenge for Sandvik Coromant, however—even its “yellow coats” couldn’t be everywhere at once. The question was how to scale the team’s services quickly, without having an impact on quality?

    IoT provided the answer. Advances in composite materials and the benefits of sensors and other IoT technologies have prompted many manufacturers to retool for the realities of “Industry 4.0.” Sandvik Coromant has taken this opportunity to create a scalable service model that delivers the same world-class quality of service and technical expertise its customers are used to, while still having “yellow coats” available when needed.

    Using Azure IoT Suite, Cortana Intelligence and Dynamics 365, Microsoft helped Sandvik Coromant to develop its service model with a predictive analytics solution that ties all of the elements of the supply chain and fabrication process together.

    IoT Suite collects, computes, and analyzes data from sensors embedded in all of the tools across the shop floor, monitoring every aspect of their performance as well as any bottlenecks in the overall supply chain or manufacturing. Then, with Cortana Intelligence, Sandvik Coromant takes that analysis, makes recommendations on how to optimize the manufacturing process, and creates a predictive maintenance schedule designed to help avoid unscheduled shutdowns. Finally, the solution integrates master data from the CRM system with metadata from the shop-floor and machining systems and makes them available through CRM to Sandvik Coromant, which can then provide feedback and support in predicting when to change or order a tool.

    “Through our close partnership with Microsoft, we have developed the new predictive analytics manufacturing solution connects an in-house shop floor control tool that collects all the information, such as machine data, tool data, and sends it to Azure for real-time analysis using Machine Learning algorithms to optimize the process in real-time and set up predictive maintenance schedules and set alarms so that we can know when to take a machine offline before a failure occurs. In the end, our customers will be able to make quicker and better informed decisions to become more profitable” says Nevzat Ertan, Chief Architect & Senior Manager, Sandvik Coromant.

    With this technology, Sandvik Coromant has digitized its deep expertise and provides services that help customers make more informed decisions, and more easily calculate the financial return on a new machining tool. That translates to additional revenue, happier customers and greater flexibility in how its technical experts connect with customers.

    Schneider Electric harnesses the sun to power remote Nigerian schools and clinics

    The average Nigerian can count on having electricity only a few hours a day, if at all. But for 11 communities, there’s now one place they know the lights will always be on: their local health clinic.

    And at 172 schools around the state of Lagos, students now not only have access to computers, they can even charge headlamps to use for studying back in their darkened homes in the evenings.

    That’s thanks to high-tech, self-contained solar systems put together with technology from Schneider Electric and funded by the U.K. Department for International Development (DFID) and the state government of Lagos. The Lagos Solar project uses batteries that are charged by solar panels, along with intelligent inverters connected to Microsoft Azure IoT technology that not only convert the battery power into usable electricity but also allow for remote monitoring and maintenance.

    It’s a vast improvement over the unreliable and polluting diesel generators most Nigerians are stuck with. The program is expanding to 270 schools and will benefit 190,000 students and 4.7 million patients by 2020, creating more than 3,000 jobs, according to the DFID.

    “This project is about powering schools and clinics, but when you bring electricity to communities, there’s a lot more upside than just things like lights and TVs, because now they’re able to power pumps for drinkable water, too,” says Xavier Datin, vice president of Schneider Electric’s solar line of business. “It’s about economics and business, but when you can bring electricity and really help people so obviously, it feels good.”

    The African country’s power woes are well known. The infrastructure in the country of 179 million only has the ability to produce enough energy to power a city about the size of Halifax, Canada, home to fewer than half a million. The Africa Progress Report 2015 found that 65 percent of Nigeria’s primary schools lack access to electricity. And the country has the world’s highest concentration of diesel generators, which are not only unreliable but generate pollution, both from operating them and from transporting fuel to the villages.

    Sunshine, however, is a clean power that Nigerians can rely on. Schneider’s standalone solar systems are able to take advantage of that resource by not only harnessing the energy but storing it for use when the sun goes down.

    The project has had a huge impact on the quality of education for Nigerian pupils, “increasing their zeal towards learning” and making them more aware of alternative energy as well, says Damilola Makindipe, the head of solar projects for the Lagos State Electricity Board. The clinics have been able to provide more and better healthcare to everyone in the chosen communities while reducing the expenses and the air and noise pollution involved with generators, she says.

    “The clinics are completely off-grid and have never had a power outage since commissioning,” Makindipe says. “People are confident that even if there’s no light anywhere in the community, there’s light at the family care centers and schools.”

    For remote sites like these that aren’t connected to a public electrical grid, there has to be a way to store solar energy so it can be used after sunset. That’s where the batteries come in. They’re each the size of a typical car battery, and they line the walls of the system’s container and get charged by the solar panels during daylight hours. Schneider’s suitcase-sized inverters then convert the batteries’ 48-volt energy into the typical Nigerian appliance’s 230 volts, running through power lines into the school or clinic to power everything from lights to laptops.

    Without this system, many of the country’s hospitals have had to rely on generators for power. If the fuel runs out or there’s another problem, they can be without electricity for 12 hours or more, which can be a grave lapse for severely ill patients needing urgent medical treatment.

    “Reliability is absolutely critical, and that’s why this solution is the most successful for remote applications like this program in Africa,” Datin says.

    It’s the ability to infuse the inverters with cloud-based intelligence that’s revolutionizing the industry and making the whole project possible, he says.

    The systems can be used anywhere, but the connectivity aspect with Schneider’s Conext Insight is particularly important for the remote schools and clinics in Nigeria. It’s difficult and expensive to send trained technicians to such rural sites to fix problems that pop up. But with the cloud-based remote monitoring in the Azure IoT Suite, a technician can be anywhere in the world and still download a necessary update to the firmware or notice that a certain level is getting low and be able to notify someone on-site to dust off the solar panel, for example. Without that element, clinics might not know anything was wrong until the power went out.

    “More and more this infrastructure equipment is not just physical hardware, but it’s run by software, and that software needs to be updated to keep an environment operating smoothly,” says Sharieff Mansour, the director of product management for Microsoft’s Internet of Things division. “Using Azure IoT Suite, Schneider will be able to connect the devices to the cloud for remote monitoring and push software down or address issues from any location, without the cost and delays of traveling to a site in Nigeria. You could be sitting here in Seattle and push those updates to Nigeria. That’s pretty powerful.”

    The system also collects data via Cortana Intelligence Suite from every unit analysis, identifying trends so technicians can address issues before they lead to outages. For example, previous history might show that a certain drop in electricity generated by a solar panel may indicate that a panel needs to be cleaned or a battery checked within 12 hours or it could fail. The analytics allow remote monitors to help proactively ward off those types of problems.

    The project is helping France-based Schneider, already a global powerhouse with revenues of $30 billion and 170,000 employees serving customers in more than 100 countries, expand its reach to work with consumers as well as the traditional commercial customers that make up the bulk of its business, Datin says.

    “Solar energy is not only renewable and carbon neutral, but with this system you can use it exactly where you need it, so you don’t incur a loss on producing, generating and transmitting that power,” Datin says. “That’s a big advantage. And economically, you can afford to produce electricity in remote places that would be very difficult to power if you had to run power lines to the locations or if you had to get diesel there for generators.

    “It’s changing the world,” he says.

    Discrete Manufacturing Solutions

    Drive excellence with Microsoft in discrete manufacturing: we want to help you accelerate your digital capabilities to maximize value co-creation across your business network. Using Microsoft’s trusted cloud platform and enterprise industrial IoT ecosystem, disrupt your competitors and drive better results, today! Our industry solutions such as digital twin, predictive maintenance, asset management, remote monitoring and blockchain will empower you to transform your products and develop new business models and revenue streams. Our solutions are built with the latest technologies, including artificial intelligence, immersive human-machine interaction, cognitive services and bot frameworks. We can help you harness these transformative technologies to enable smart factories, optimize supply chain management, drive new levels of automation, or advance other business processes. Whether your business transformation in discrete manufacturing processes is within the automotive, high tech, aerospace, or industrial equipment industries, we want to partner with you. We can help you excel at digital so you can focus on what you do best: achieve unprecedented levels of operational, and business excellence. Let’s make Industry 4.0 a reality today, together!

    Predictive Maintenance solution

    The Predictive Maintenance solution gives you better visibility into equipment status, letting you resolve issues before they disrupt your business.

    Monitor your assets in near-real time by collecting data through Azure IoT Suite. This allows you to create automatic alerts and actions, such as remote diagnostics, maintenance requests, and other workflows.

    Then perform historical analysis of your data and predict when you need to service equipment.
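The "predict when to service" step can start as simply as a threshold over historical usage before graduating to full machine-learning models. A deliberately tiny sketch (all names and numbers are invented):

```python
def needs_service(hours_run, rated_hours=10_000, margin=0.10):
    """True when remaining life falls below the safety margin of rated life."""
    remaining = rated_hours - hours_run
    return remaining <= rated_hours * margin

print(needs_service(9500))  # True
print(needs_service(4000))  # False
```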

    Remote Monitoring solution

    With the Remote Monitoring solution, you can monitor assets located nearly anywhere from afar. The solution helps you understand equipment conditions, enabling you to provide over-the-wire updates and fine-tune processes.

    To optimize business processes in the long term, the solution applies analytics techniques, like machine learning, to your data. The smart system performs in-operation analysis to find correlations across multiple data streams—letting you improve costs, uptime, and product quality. Plus, you can leverage new predictive maintenance programs to perform historical analysis of your data and resolve issues before they disrupt your business.


    Connected Field Service solution

    The Connected Field Service solution allows manufacturers to know about problems before the customer does and solve them at the smallest cost to the organization.

    In a simple scenario, where an abnormality is detected, a sensor sends an alert to an Azure IoT Hub. This triggers a configurable workflow process within Dynamics 365 for Field Service.

    A field technician is dispatched and arrives onsite to resolve the problem, ensuring a first-time fix.

    This proactive approach improves customer satisfaction and resource productivity by catching issues and troubleshooting them remotely—before they significantly impact your business.


    Realize the potential of remote monitoring with IoT

    The promise

    Imagine if your assets had eyes and ears, and could talk to you in real time. That’s what IoT-driven remote monitoring offers. It involves collecting data from assets, and using that data to trigger automatic alerts and actions, such as remote diagnostics, maintenance requests, and other operational processes.

    IoT is a game-changer

    What used to be a manual, time-intensive procedure can now be dynamic, rapid, and automated. Now, assets located nearly anywhere can be monitored from afar. With live data from smart sensors and devices, organizations get better visibility into operational status, and can quickly, automatically respond to current conditions.

    Benefits of using Microsoft Azure IoT Suite

    Get started quickly with the remote monitoring preconfigured solution in the Azure IoT Suite to connect and monitor your devices in order to analyze untapped data and automate business processes.

    Start by determining the business objectives of your remote monitoring project. Examples include faster responses to equipment issues, or better insight into asset performance. The more specific you can be about the outcomes you want to achieve, the better. This is also a key part of the business case for the project.

    When you’ve identified a business process you want to improve, identify elements of the process that an IoT remote monitoring solution could address. This likely requires analysis of the end-to-end business process—how it works today, where the inefficiencies are, and what changes you want to make.

    For example, you might want a service alert or ticket to be created automatically if a temperature reading on a remote asset reaches a certain threshold. You’ll need to identify the systems, tools, and teams that would need to be involved in making that possible, the requirements that need to be met, and the gaps and obstacles that exist.

    This kind of analysis will help you determine the capabilities your solution must have, and will also indicate how extensive the business process changes might be. For example, if you want roaming maintenance technicians to receive real-time alerts of equipment problems, they need to be equipped with devices that deliver those alerts. And if you want technicians to respond immediately to alerts, their workflow will need to be adjusted to reflect that their priorities could dynamically shift if an alert comes in.
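    The temperature-threshold scenario above can be sketched in a few lines of Python. The field names, device IDs, and threshold are hypothetical; a real solution would raise the ticket through your workflow or field-service system rather than return a dict:

```python
def evaluate_telemetry(readings, threshold):
    """Return a service-ticket dict for each reading above the threshold,
    mirroring the 'temperature alert creates a ticket' scenario."""
    tickets = []
    for reading in readings:
        if reading["temperature"] > threshold:
            tickets.append({
                "device_id": reading["device_id"],
                "action": "create_service_ticket",
                "detail": f"temperature {reading['temperature']} exceeded {threshold}",
            })
    return tickets

readings = [
    {"device_id": "pump-01", "temperature": 72.5},
    {"device_id": "pump-02", "temperature": 96.1},
]
print(evaluate_telemetry(readings, 90.0))
```

    Even this toy version shows why the surrounding process matters: someone, or some system, must receive the ticket and act on it for the alert to have value.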

    Establish bi-directional communication with billions of IoT devices

    Rely on Azure IoT Hub to easily and securely connect your Internet of Things (IoT) assets. Use device-to-cloud telemetry data to understand the state of your devices and assets, and be ready to take action when an IoT device needs your attention. In cloud-to-device messages, reliably send commands and notifications to your connected devices—and track message delivery with acknowledgement receipts. Device messages are sent in a durable way to accommodate intermittently connected devices.

    Work with familiar platforms and protocols

    Add new IoT devices—and connect existing ones—using open-source device SDKs for multiple platforms, including Linux, Windows, and real-time operating systems. Use standard and custom protocols, including HTTP, Advanced Message Queuing Protocol (AMQP), and MQ Telemetry Transport (MQTT).

    Authenticate per device for security-enhanced IoT solutions

    Set up individual identities and credentials for each of your connected devices—and help retain the confidentiality of both cloud-to-device and device-to-cloud messages. To maintain the integrity of your system, selectively revoke access rights for specific devices as needed.
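    Per-device credentials in IoT Hub are commonly presented as shared access signature (SAS) tokens derived from a device key. The sketch below follows the documented token format (sr, sig, and se fields, HMAC-SHA256 over the encoded URI and expiry); the resource URI and key shown are placeholders, not real credentials:

```python
import base64
import hashlib
import hmac
import time
import urllib.parse

def generate_sas_token(resource_uri: str, device_key: str, ttl_seconds: int = 3600) -> str:
    """Build a per-device shared access signature for an IoT Hub resource URI."""
    expiry = str(int(time.time()) + ttl_seconds)
    encoded_uri = urllib.parse.quote_plus(resource_uri)
    # Sign "<encoded-uri>\n<expiry>" with the base64-decoded device key.
    to_sign = f"{encoded_uri}\n{expiry}".encode()
    signature = base64.b64encode(
        hmac.new(base64.b64decode(device_key), to_sign, hashlib.sha256).digest()
    )
    return (
        f"SharedAccessSignature sr={encoded_uri}"
        f"&sig={urllib.parse.quote_plus(signature)}&se={expiry}"
    )
```

    Because each device signs with its own key, revoking one device's access (by disabling its identity in the hub) does not affect any other device.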

    Manage your IoT devices at scale with device management

    With new device management capabilities in IoT Hub, administrators can remotely maintain, update, and manage IoT devices at scale from the cloud. Save time and money by eliminating the need to develop and maintain a custom device management solution, or to spend resources traveling to maintain global assets.

    Extend the power of the cloud to your edge device

    Take advantage of Azure IoT Edge to make hybrid cloud and edge IoT solutions a reality. IoT Edge provides easy orchestration between code and services, so they flow securely between cloud and edge to distribute intelligence across a range of devices. Enable artificial intelligence and other advanced analytics at the edge, reduce your IoT solution costs, ease development efforts, and operate devices offline or with intermittent connectivity.

    Common scenarios for Azure Functions

    Timer-based processing

    Azure Functions supports scheduled execution using a timer trigger with cron-style syntax. For example, execute code every 15 minutes that cleans up a database table based on custom business logic.
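    To make the cron idea concrete, here is a minimal, illustrative matcher for the minute field of a schedule such as */15. This is a teaching sketch only; in Azure Functions the timer trigger schedule is declared in the function binding, not hand-rolled:

```python
from datetime import datetime

def minute_field_matches(field: str, minute: int) -> bool:
    """Match one cron minute field: '*', '*/N', or a comma list like '0,30'."""
    if field == "*":
        return True
    if field.startswith("*/"):
        return minute % int(field[2:]) == 0
    return minute in {int(v) for v in field.split(",")}

def should_run(minute_field: str, now: datetime) -> bool:
    """Decide whether a job with the given minute field fires at `now`."""
    return minute_field_matches(minute_field, now.minute)

# A schedule with minute field "*/15" fires at :00, :15, :30 and :45.
print(should_run("*/15", datetime(2024, 1, 1, 10, 30)))  # True
```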

    Azure service event processing

    Azure Functions supports triggering an event based on activity in an Azure service. For example, execute serverless code that reads newly discovered test log files in an Azure Blob storage container and transforms them into rows in an Azure SQL Database table.
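    As an illustrative sketch of the blob-to-table transformation step (the log format and column layout here are assumptions for the example, not part of any Azure API):

```python
import csv
import io

def log_to_rows(log_text: str):
    """Parse 'timestamp,level,message' log lines, as they might arrive from a
    blob, into tuples ready for a parameterized SQL INSERT."""
    rows = []
    for record in csv.reader(io.StringIO(log_text)):
        if len(record) == 3:  # skip blank or malformed lines
            rows.append(tuple(record))
    return rows

sample = "2024-01-01T10:00:00,INFO,test passed\n2024-01-01T10:01:00,ERROR,timeout"
print(log_to_rows(sample))
```

    In the actual function, the blob trigger would supply the file contents and the resulting tuples would be passed to a parameterized INSERT against the SQL database.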

    SaaS event processing

    Azure Functions supports triggers based on activity in a SaaS service. For example, saving a spreadsheet in OneDrive triggers a function that uses the Microsoft Graph API to modify the spreadsheet, creating additional charts and calculated data.

    Microsoft Azure and Machine Learning Studio

    Microsoft Azure

    Microsoft Azure is the collective brand name for Microsoft’s cloud computing services, which provide IaaS and PaaS service models. It covers a broad and still-growing range of services that often form the foundational elements of cloud computing.

    Cloud computing is a term for computing resources and services, such as server and network infrastructure, web servers, and databases, hosted by cloud service vendors, rented by tenants, and delivered via the internet.

    Machine Learning

    Machine learning is a type of artificial intelligence (AI) that allows software applications to become more accurate in predicting outcomes without being explicitly programmed. The basic premise of machine learning is to build algorithms that can receive input data and use statistical analysis to predict an output value within an acceptable range.


    The processes involved in machine learning are similar to that of data mining and predictive modeling. Both require searching through data to look for patterns and adjusting program actions accordingly.
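    A toy example of "learning from data rather than explicit rules" is a one-nearest-neighbour classifier: it predicts by finding the most similar past example. Everything below, including the feature values and labels, is illustrative:

```python
def predict(train, query):
    """1-nearest-neighbour: return the label of the closest training point
    (squared Euclidean distance). No rule about 'small' vs 'large' is ever
    written down; the answer comes entirely from the data."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    features, label = min(train, key=lambda pair: dist(pair[0], query))
    return label

train = [((1.0, 1.0), "small"), ((5.0, 5.0), "large"), ((6.0, 5.5), "large")]
print(predict(train, (1.2, 0.9)))  # → "small"
```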


    Microsoft Azure and Machine Learning Studio

    Azure Machine Learning is a cloud predictive analytics service that makes it possible to quickly create and deploy predictive models as analytics solutions.

    Anyone can work from a ready-to-use library of algorithms, use them to create models on an internet-connected PC, and deploy a predictive solution quickly. Start from ready-to-use examples and solutions in the Azure AI Gallery.

    The main components of Azure Machine Learning are:

    • Azure Machine Learning Workbench
    • Azure Machine Learning Experimentation Service
    • Azure Machine Learning Model Management Service
    • Microsoft Machine Learning Libraries for Apache Spark (MMLSpark Library)
    • Visual Studio Code Tools for AI

    Together, these applications and services help significantly accelerate data science project development and deployment.

    This article defines and describes the concepts you need to know to use Azure Machine Learning.

    –  Subscription: An Azure subscription grants you access to resources in Azure. Because Azure Machine Learning is deeply integrated with compute, storage, and many other Azure resources and services, Workbench requires that each user has access to a valid Azure subscription. Users must also have sufficient permissions within that subscription to create resources.

    –  Experimentation Account: An experimentation account is an Azure resource required by Azure ML, and a billing vehicle. It contains your workspaces, which in turn contain projects. You can add multiple users, referred to as seats, to an experimentation account. You must have access to an experimentation account in order to use Azure ML Workbench to run experiments.

    –  Model Management Account: A model management account is also an Azure resource required by Azure ML for managing models. You can use it to register models and manifests, build containerized web services, and deploy them locally or in the cloud. It is also the other billing vehicle of Azure ML.

    –  Workspace: A Workspace is the primary component for sharing and collaboration in Azure ML. Projects are grouped within a workspace. A workspace can then be shared with multiple users that have been added to the experimentation account.

    –  Project: In Azure Machine Learning, a project is the logical container for all the work being done to solve a problem. It maps to a single file folder on your local disk, and you can add any files or sub folders to it. A project can optionally be associated with a Git repository for source control and collaboration.

    –  Experiment: In Azure ML, an experiment is one or more source code file(s) that can be executed from a single entry point. It may contain tasks such as data ingestion, feature engineering, model training, or model evaluation. Currently, Azure ML supports Python or PySpark experiments only.

    –  Model: In Azure Machine Learning, models refer to the product of a machine learning experiment. They are recipes that when applied correctly to data, generate predicted values. Models can be deployed to test or production environments, and used for scoring new data. Once in production, models can be monitored for performance and data drift, and retrained as required.

    –  Compute Target: A compute target is the compute resource that you configure for executing your experiments. It can be your local computer (Windows or macOS), Docker container running on your local computer or in a Linux VM on Azure, or an HDInsight Spark cluster.

    –  Run: The Experimentation Service defines a run as the lifetime of an experiment execution in a compute target. Azure ML captures information of each run automatically and presents the entire history of a particular experiment in the form of run history.

    –  Environment: In Azure Machine Learning, an environment denotes a particular computing resource that is used for deploying and managing your models. It can be your local computer, a Linux VM on Azure, or a Kubernetes cluster running in Azure Container Service, depending on context and configuration. Your model is hosted in a Docker container running in these environments and exposed as a REST API endpoint.

    –  Managed model: Model Management enables you to deploy models as web services, manage various versions of your models, and monitor their performance and metrics. Managed models are registered with an Azure Machine Learning Model Management account.

    –  Manifests: When the Model Management system deploys a model into production, it includes a manifest that can encompass model, dependencies, scoring script, sample data, and schema. The manifest is the recipe used to create a Docker container image. Using Model Management, you can auto-generate the manifests, create different versions, and manage these manifests.

    –  Images: You can use manifests to generate (and regenerate) Docker images. Containerized Docker images give you the flexibility to run them in the cloud, on local machines, or on IoT devices. Images are self-contained and include all dependencies required for scoring new data with models.

    –  Services: Model Management allows you to deploy models as web services. The web-service logic and dependencies are encapsulated into an image. Each web service is a set of containers, based on that image, ready to serve requests at a given URL. A web service is counted as a single deployment.

    Easy Solutions to Stop the Log File From Growing Too Big

    Various organizations run huge SQL databases that perform more than millions of transactions per hour. A SQL Server database has data files and transaction log files. Data files store the user data, while transaction log files record every change made to the database, along with the details of the transactions that made those changes.

    Now, the issue is that this logging of details every time changes are made in SQL Server cannot be controlled or stopped, which becomes a grave problem when the transaction log grows too large. However, the way these log files grow can be configured and controlled. So, to keep the SQL Server log file from growing unexpectedly, consider any of the methods given below. It is also good practice to shrink the log file to reclaim disk space; we discuss ways to do that as well.

    SQL Server - Solutions to Stop the Log File From Growing Too Big

    There are numerous ways to truncate an oversized SQL .ldf file. Some of the chief solutions are provided in the following segment.

    • Monitor the default size limit: If the SQL .ldf file is growing too big, set a default size limit on the log file so that it does not expand automatically and overload the SQL Server database.
    • Use memory units: If the SQL transaction log file grows quickly, configure the expansion of log files using memory units (MB or GB) rather than a percentage.
    • Change the recovery model: The simple recovery model helps in controlling and shrinking the log file size. Based on how crucial the data is, the user can choose any of the following recovery models:
      1. Simple Recovery Model
      2. Bulk-logged Recovery Model
      3. Full Recovery Model

    In the simple model, the database can be restored only to its most recent backup, while in the bulk-logged or full recovery model, the database can be recovered up to the point of failure. This recovery is done by restoring the transaction log file.

    By default, the full recovery model is set. In that case, the user has to back up the transaction log regularly to prevent it from becoming too large and to remove inactive transactions from it. Keep this in mind when taking backups of .ldf files.
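    For reference, the recovery-model change and the regular log backup described above can also be issued in T-SQL. The database name [SalesDb] and backup path below are hypothetical placeholders; adjust them to your environment:

```sql
-- Option 1: switch to the simple recovery model so the log is truncated
-- automatically at checkpoints (point-in-time recovery is no longer possible).
ALTER DATABASE [SalesDb] SET RECOVERY SIMPLE;

-- Option 2: stay in the full recovery model and back up the log regularly,
-- which removes inactive transactions and keeps the .ldf file in check.
BACKUP LOG [SalesDb] TO DISK = N'D:\Backups\SalesDb.trn';
```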

    NOTE: When defragmenting indexes, use DBCC INDEXDEFRAG rather than DBCC DBREINDEX. If DBCC DBREINDEX is used, the transaction log file might expand drastically.

    Using the manual solution:

    If maintenance is not carried out regularly, the log file grows too big. Therefore, it is recommended to take these manual steps before it consumes all available disk space. Now, let us look at the method to shrink a SQL database’s transaction log file:

    1. First, open SQL Server Management Studio and log in to the proper SQL instance.
    2. In the Object Explorer tree, expand the Databases folder >> select the database with the large .ldf file.
    3. Create a full backup of the database by right-clicking it >> Select Tasks >> Back Up.
      • Make sure the backup type is set to Full, then delete any existing destinations >> add a new Disk destination.
      • Browse to a location with plenty of free disk space >> name the backup file with the .BAK extension.
      • Choose the option to Overwrite all existing backup sets on the ‘Options’ page.
      • Finally, click the ‘OK’ button to start the backup.
    4. Similarly, create a transaction log backup of the database in the same manner as above.
      • Right-click the database >> Select Tasks >> Back Up and make sure the backup type is set to Transaction Log.
      • Choose the option to Overwrite all existing backup sets on the ‘Options’ page.
      • Finally, click the ‘OK’ button to start the backup.
    5. The closing step is shrinking the transaction log file: right-click the database >> Tasks >> Shrink >> Files.

    NOTE: The user may repeat steps 3, 4, and 5 until the .ldf file becomes physically smaller.
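    The GUI steps above map to the following T-SQL, shown as a sketch with a hypothetical database name [SalesDb], logical log file name SalesDb_log, and backup paths:

```sql
-- Step 3: full database backup (WITH INIT overwrites existing backup sets).
BACKUP DATABASE [SalesDb] TO DISK = N'D:\Backups\SalesDb.bak' WITH INIT;

-- Step 4: transaction log backup.
BACKUP LOG [SalesDb] TO DISK = N'D:\Backups\SalesDb.trn' WITH INIT;

-- Step 5: shrink the log file (logical file name, target size in MB).
USE [SalesDb];
DBCC SHRINKFILE (N'SalesDb_log', 1);
```

    As the note says, the log and shrink steps may need to be repeated before the file physically shrinks, because the active portion of the log must first cycle back to the start of the file.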


    In this content, we have discussed various solutions for truncating .ldf files when the transaction log grows too large, which is necessary in order to manage data with ease. A manual method and some general solutions have been presented to guide users who come across such issues while dealing with SQL Server.

    Judging Microsoft Imagine Cup – 2017

    Judging Microsoft Imagine Cup (1st Round) – 2017 in Bangladesh
    Venue: Microsoft Office, Dhaka, Bangladesh
    Date: 19 March, 2017