My talk would walk the audience through building an AI-powered web app using Python and Flask, running on Microsoft Azure. The web app analyzes faces using AI, stores information about the emotions it detects in an image, and notifies the user if it detects sad faces multiple times. It also serves a simple HTML page showing the data captured.
How would this work?
Once a picture has been successfully taken, a Web API built in Python will receive the picture and analyze it using the Azure Face API from Azure Cognitive Services, an AI service that can recognize faces in images and estimate attributes such as age and whether the person is smiling.
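A minimal sketch of how that detection call might look is below. The endpoint path, query parameters, and response shape are assumptions based on the Face API's documented REST format, not code from the actual app:

```python
import json
import urllib.request

def detect_faces(image_bytes, endpoint, key):
    """Send an image to the Face API detect endpoint and return the
    detected faces with emotion attributes. (Illustrative: the exact
    endpoint path and parameters may differ in your Azure region.)"""
    url = endpoint + "/face/v1.0/detect?returnFaceAttributes=emotion,age,smile"
    req = urllib.request.Request(
        url,
        data=image_bytes,
        headers={
            "Ocp-Apim-Subscription-Key": key,
            "Content-Type": "application/octet-stream",
        },
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

def dominant_emotion(face):
    """Pick the highest-scoring emotion from one face record,
    assuming the response nests scores under faceAttributes.emotion."""
    scores = face["faceAttributes"]["emotion"]
    return max(scores, key=scores.get)
```

The service returns per-emotion confidence scores for each face, so the app would take the highest-scoring emotion as that face's label.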
The Web API will use this cognitive service to detect the emotion of every face, and the results will be saved to Azure Cosmos DB, a document database. These documents contain key/value pairs of data stored in JSON format. The API will also return a count of emotions, which the Python app uses to ask whether the user is 'okay' when the number of sad faces is greater than or equal to 3. This is a simple illustration of how this technology could be used as a self-care app.
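The counting and check-in logic described above could be sketched as follows. The document shape, field names, and the `askIfOkay` flag are illustrative assumptions, not the app's actual schema:

```python
from collections import Counter

# Assumed threshold: check in on the user at 3 or more sad faces.
SAD_THRESHOLD = 3

def summarise_emotions(dominant_emotions):
    """Given the dominant emotion of each detected face, build the
    JSON-style document to store in Cosmos DB and decide whether the
    app should ask the user if they're okay."""
    counts = Counter(dominant_emotions)
    return {
        "emotionCounts": dict(counts),
        "askIfOkay": counts.get("sadness", 0) >= SAD_THRESHOLD,
    }
```

For example, three sad faces out of four would set `askIfOkay`, prompting the app to check in on the user; the same dictionary could be written to Cosmos DB as-is, since it is already a JSON-serializable set of key/value pairs.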
This is certainly going to be a fun learning experience :)
PS: It is pertinent to note that this is NOT a workshop, just an insight into what's possible with Azure's Face API (a member of the Azure Cognitive Services family). A proper workshop on this topic would take 3+ hours, which we obviously don't have, but one is publicly available here:
https://github.com/microsoft/hackwithazure