@MonicaSLam blogged about the First Open Virtual Assistant Workshop:
https://almond.stanford.edu/blog/10-the-first-open-virtual-assistant-workshop
Please reply here if you have any comments or suggestions!
In the talks at the 1st OVAL workshop, both Monica and Giovanni raised the issue of data sharing between virtual assistants and the way that fine-grained control increases acceptance. However, they also highlighted the problem of enforceability (the library-card problem). I think evolutionary theory can help with the question of enforceability, and in fact with much more. It may be possible to incorporate cooperative processes within the Almond policy database that would enable sophisticated data-sharing protocols between individuals using virtual assistants.
These could take a number of forms.
I’m not a coder, but would be very interested to be part of a conversation about how to use measures of trust to improve the way a virtual assistant works.
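To make the idea above concrete, here is a minimal sketch (not Almond's actual policy API; all names here are hypothetical) of a pairwise trust ledger in which each cooperative data-sharing interaction nudges trust up, a defection knocks it down harder, and a sharing request is only granted once trust crosses a threshold:

```python
# Hypothetical sketch only: a pairwise trust score in [0, 1] that is
# reinforced by cooperative interactions and used to gate data sharing.
from dataclasses import dataclass, field

@dataclass
class TrustLedger:
    # (owner, requester) -> trust score in [0, 1]; unknown pairs default to 0
    scores: dict = field(default_factory=dict)

    def get(self, owner: str, requester: str) -> float:
        return self.scores.get((owner, requester), 0.0)

    def record(self, owner: str, requester: str, cooperated: bool) -> None:
        # Cooperation raises trust slowly; defection lowers it faster,
        # loosely in the spirit of iterated-game strategies like tit-for-tat.
        t = self.get(owner, requester)
        t = min(1.0, t + 0.1) if cooperated else max(0.0, t - 0.3)
        self.scores[(owner, requester)] = t

def allow_sharing(ledger: TrustLedger, owner: str, requester: str,
                  threshold: float = 0.5) -> bool:
    """Grant a data-sharing request only above the trust threshold."""
    return ledger.get(owner, requester) >= threshold

ledger = TrustLedger()
for _ in range(5):
    ledger.record("alice", "bob", cooperated=True)  # five good interactions
print(allow_sharing(ledger, "alice", "bob"))  # True: trust has been earned
print(allow_sharing(ledger, "alice", "eve"))  # False: no interaction history
```

In a real assistant the update rule, threshold, and what counts as "cooperation" would all be policy decisions, and the formal access-control layer Giovanni describes would still apply underneath; this only illustrates how an evolutionary, trust-building process could sit on top of it.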
Hi Kit, welcome to the community!
The goal of the access-control work was to ensure some notion of formal security, which should hold even without trust. If there is trust between users, then I absolutely agree that more interesting interactions become possible.
We have not really thought about that and, to be fair, have not worked on access control and the distributed virtual assistant for a while now, as we have focused on NLP and end-user programmability. But it's still something we're interested in, and it could be a great shared project. Do you perhaps have a student who would be interested in building this?
Hi Giovanni, thanks for your reply. What a great idea to try and get a student to do this! I will contact the Computer Science Department here at Bristol and see whether I can interest anyone. I’ll let you know how I get on. Best, Kit