Sound Microphone Actions After Zap

Hello all, has anyone been able to use the microphone to make actions happen after a zap has been initiated? I want microphone input to trigger an action after the zap has loaded the AR content.

Any suggestions would be greatly appreciated. Thank you in advance!!

Hi @mecostanza,

The ZapWorks tools don’t currently support speech input, so this functionality wouldn’t be possible, unfortunately.

If you could share some more details on your project, I may be able to provide an alternative solution.

All the best,
Seb

Hello,
Thank you for your reply.

The app we created is based on another system, where the user is able to say yes or no and the microphone's general input class is used to trigger an action. We like your system and are trying to migrate from ARKit, as it seems much easier to use.

Thank you very much!

Thanks for the quick response @mecostanza.

Would a normal touch input system not be suitable in this case, or does your implementation rely on speech input?

Seb

Hello,

Today, while updating my Zappar app on my Android phone, Google displayed this: “This app now asks for a new permission: the microphone. Do you accept?”

So I guess there is a way to use it in Studio?

Thanks.

Hello Seb,

Thank you for your response. Unfortunately, touch will not work. We could get away with any microphone input at all if we were able to use the microphone class and have it randomize the actions.

I appreciate your help and reply!
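
To illustrate the kind of behaviour being asked for here, below is a minimal sketch using the standard browser Web Audio API (not a ZapWorks Studio API, which the replies above note does not expose microphone input): whenever the microphone level crosses a threshold, one of a set of placeholder actions is picked at random. The threshold, cooldown, and action list are assumptions for the example.

```typescript
// Sketch only: plain browser Web Audio API, not a ZapWorks Studio API.
// Listens for any microphone input above a volume threshold and fires
// a randomly chosen action, with a short cooldown between triggers.

type Action = () => void;

const actions: Action[] = [
  () => console.log("Play animation A"),   // hypothetical placeholder actions
  () => console.log("Play animation B"),
  () => console.log("Show product info"),
];

async function listenForSound(threshold = 0.1, cooldownMs = 1000): Promise<void> {
  // Ask the browser for microphone access (shows a permission prompt).
  const stream = await navigator.mediaDevices.getUserMedia({ audio: true });

  const audioCtx = new AudioContext();
  const source = audioCtx.createMediaStreamSource(stream);
  const analyser = audioCtx.createAnalyser();
  analyser.fftSize = 256;
  source.connect(analyser);

  const samples = new Uint8Array(analyser.fftSize);
  let lastTrigger = 0;

  const poll = () => {
    analyser.getByteTimeDomainData(samples);

    // Rough RMS level, normalised to 0..1 (128 is silence for byte data).
    let sumSquares = 0;
    for (const s of samples) {
      const v = (s - 128) / 128;
      sumSquares += v * v;
    }
    const level = Math.sqrt(sumSquares / samples.length);

    const now = performance.now();
    if (level > threshold && now - lastTrigger > cooldownMs) {
      lastTrigger = now;
      // Pick one of the actions at random, as described above.
      actions[Math.floor(Math.random() * actions.length)]();
    }

    requestAnimationFrame(poll);
  };

  poll();
}
```

In most browsers, `listenForSound()` would need to be called from a user gesture (such as a tap) on a secure (HTTPS) page, since microphone access sits behind a permission prompt and audio contexts generally only start after user interaction. The cooldown stops one loud sound from firing several actions in a row.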

Hello Alain1,

Thank you for your input. I will check the update and see if it will allow the microphone class.

Thank you again!

Hi all,

Good spot @alain1 :slight_smile:. The recent app update does indeed ask for microphone permission; however, this isn’t linked to speech input or voice recognition.

The update is actually in preparation for a new video recording feature for the app (which we hope you’ll all find exciting), though at this early stage we don’t have any further information to share.

We’ll keep everyone updated on this feature’s progress so make sure to check back on the forum regularly!

All the best,
Seb

Nice, thank you Seb!

As a feature request, I suggest this: use the microphone to listen to the sound produced by a TV and decode a deep link contained in it.
Put differently: you watch an ad on TV, it asks you to open the Zappar app and listen to the audio, and the app automatically opens the experience that shows the product.

Cheers,
Alain
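
As a rough illustration of this idea, here is a toy sketch (hypothetical, not a Zappar feature) using the browser Web Audio API: instead of decoding real data from the audio, it simply watches for strong energy at one assumed beacon frequency and then opens a fixed, made-up deep link. An actual implementation would need to encode and decode an identifier in the broadcast audio.

```typescript
// Toy illustration only (hypothetical, not a Zappar feature): detect a
// near-ultrasonic "beacon" tone in the microphone signal and open a
// predetermined deep link. Real audio deep-linking would carry data
// (e.g. an experience ID) rather than a single tone.

const BEACON_HZ = 18000;                             // assumed beacon frequency
const DEEP_LINK = "https://example.com/experience";  // hypothetical deep link

async function watchForBeacon(): Promise<void> {
  const stream = await navigator.mediaDevices.getUserMedia({ audio: true });
  const ctx = new AudioContext();
  const analyser = ctx.createAnalyser();
  analyser.fftSize = 2048;
  ctx.createMediaStreamSource(stream).connect(analyser);

  const bins = new Uint8Array(analyser.frequencyBinCount);
  // Each frequency bin covers sampleRate / fftSize Hz.
  const binIndex = Math.round(BEACON_HZ / (ctx.sampleRate / analyser.fftSize));

  const poll = () => {
    analyser.getByteFrequencyData(bins);
    if (bins[binIndex] > 200) {          // strong energy at the beacon frequency
      window.location.href = DEEP_LINK;  // "open the experience"
      return;                            // stop polling once triggered
    }
    requestAnimationFrame(poll);
  };
  poll();
}
```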

Thank you for researching this for me. We will see what we can do with touch input until this capability becomes available. Have a great day.

That would be a great step for the AR experience.