Hi, this is Guy Barker here, and I'm going to give a quick demo of a pre-release version of the Sa11ytaire app. That's spelled S-a-11-y-t-a-i-r-e. This version uses Azure Cognitive Services in two ways. One: it will recognize playing cards being shown to the app. And two: when it recognizes the name of the card, it will actually speak it. So it uses the Azure Custom Vision service and the Azure Text to Speech service.

To give an example of that, I'm going to show the three of clubs to the app. First, though, I'll turn on the feature that has the app look for cards. Okay, it now says "Looking" at the top of the app. Here we go... (app says) "Three of clubs". OK, and at the top of the app it says "Most recent card seen: three of clubs", and it spoke it. So it did successfully recognize that card. Now interestingly, because there is no three of clubs shown in the game, nothing happened in the game.

So how about if we select the two of diamonds? Here we go... (app says) "Two of diamonds". There we go. It recognized the two of diamonds, spoke its name, and selected it in the app. Let's try to put that on the three of hearts. First, though, I'll show it the three of diamonds. Here we go... (app says) "Three of di..." Well, it started saying "three of diamonds", and because there is no three of diamonds there, nothing happened.

I'll now show the three of hearts. There we go... (app says) "Three of hearts". Now, it recognized that, and realized that the two of diamonds can't be put on the three of hearts, so nothing happened. So now I'll show it the two of spades. (app says) "Two of spades". And it's selected it in the app. Let's see if it can put that on the three of hearts. So this is the three of hearts, here we go... (app says) "Three of hearts". Yes, and it successfully moved the two of spades to the three of hearts. So there we have it. Oh, tell you what, since the three of spades is there, we'll try to select it.
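[As an aside from the demo: the recognition step described above can be sketched in code. The Custom Vision prediction API returns a list of tagged predictions with probabilities, and an app like this one would typically pick the top tag only when it is confident enough, otherwise staying in the "Looking" state. This is a minimal Python sketch, not the app's actual code; the tag names, the threshold value, and the `top_prediction` helper are all assumptions.]

```python
# Sketch (not the app's actual code) of reducing a Custom Vision
# prediction response to a single card name. The response shape follows
# the Custom Vision prediction REST API, which returns a list of
# {"tagName": ..., "probability": ...} entries.

def top_prediction(predictions, threshold=0.8):
    """Return the most probable tag name, or None if no prediction is
    confident enough -- so the app keeps 'Looking' until a card is
    clearly seen. The 0.8 threshold is an assumed value."""
    best = max(predictions, key=lambda p: p["probability"], default=None)
    if best is None or best["probability"] < threshold:
        return None
    return best["tagName"]

# Example response for a frame showing the three of clubs (made-up numbers):
sample = [
    {"tagName": "three of clubs", "probability": 0.97},
    {"tagName": "three of spades", "probability": 0.02},
]
print(top_prediction(sample))                   # -> three of clubs
print(top_prediction(sample, threshold=0.99))   # -> None (keep looking)
```

[A threshold like this would also explain the demo's behavior of simply doing nothing when a card isn't recognized confidently, rather than guessing.]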
I'll move it a bit closer. There we go. (app says) "Three of..." Oh, I suspect the truncation there is because the app is inappropriately having the speech say nothing, and that is unintentionally truncating some speech already in progress, so I suspect that's an issue with the app.

But in this pre-release version we have the Azure Custom Vision service being used to recognize these playing cards, and it did so pretty well there, and then Azure Text to Speech speaking the card's name. The voice used is one that I chose out of many available to me. So there we go; this will be up at the store hopefully before long. Thank you.
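[Postscript on the truncation heard above: if a text-to-speech engine treats every speak request, even an empty one, as interrupting whatever it is currently saying, then accidentally sending an empty string would cut off a card name mid-word, exactly as in the demo. A minimal Python sketch of the guard that would avoid that; the `Synthesizer` class is a toy stand-in, not the Azure Text to Speech API, and `speak_card` is a hypothetical helper.]

```python
# Toy stand-in for a TTS engine in which any speak request, including an
# empty one, cancels the utterance already in progress.

class Synthesizer:
    def __init__(self):
        self.current = None   # what the engine is "saying" now
        self.spoken = []      # every request that reached the engine

    def speak(self, text):
        self.current = text   # a new request interrupts the old one
        self.spoken.append(text)

def speak_card(synth, card_name):
    # Guard: never let an empty name reach the engine, or it would
    # truncate a card name still being announced.
    if card_name:
        synth.speak(card_name)

synth = Synthesizer()
speak_card(synth, "Three of spades")
speak_card(synth, "")        # ignored by the guard: no truncation
print(synth.current)         # -> Three of spades
```

[With the guard in place, only non-empty names reach the engine, so an announcement in progress is never cut off by an accidental empty request.]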