I did not include Xiaoice (Xiaobing) because AFAIK there was no 3rd party integration or any way to use it in a custom product - it was a self-contained bot, not a framework.
Exactly. Xiaoice is closed source!
A digression:
in general terms, Xiaoice is a “personal assistant”, but I’d define it as a “socialbot”, following Microsoft’s definition in the Xiaoice academic papers, or following Amazon’s definition (see also the interesting Alexa Prize competition academic papers).
The winner in the socialbot space is, de facto, Mitsuku (please watch the talk: https://twitter.com/MitsukuChatbot/status/1173489720304820226), a “hand-crafted” bot built by a single developer over many years, using the (now closed source) old & ugly XML-like AIML language.
BTW, Steve Worswick’s Mitsuku won the Loebner Prize competition again this year. See the news from a few days ago: https://twitter.com/MitsukuChatbot/status/1173254583214325762
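To give an idea of what “hand-crafted” means in AIML: rules are written as pattern/template pairs in XML. A minimal illustrative category (made up for this example, not taken from Mitsuku’s actual rule base) looks like:

```xml
<aiml version="2.0">
  <category>
    <!-- "*" is an AIML wildcard matching any trailing words -->
    <pattern>HELLO *</pattern>
    <template>Hi there! How are you today?</template>
  </category>
</aiml>
```

Every response a bot like Mitsuku gives is backed by thousands of categories like this, written and refined by hand over years.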
I did include Bot Framework as a proxy for LUIS (or whatever is the new name - Cognitive Services? Azure AI?).
Too generous
OSS is trendy now, so many big players release client-side SDKs on GitHub (for cloud-based closed source products), or some proxy tools as you say. So in Microsoft’s case, for example, maybe you have some open source components in the Bot Framework, but the core NLU engine, LUIS, like the other Microsoft “Cognitive Services”, is fully closed source software.
For me that is NOT OSS at all!
Almond is open source because you published ALL the code, AFAIK!
ChatScript is open source too (just as an example).
I’m sure you agree on that :)
As for snips, my understanding is that it was meant to be customized for a specific use case and speaker, and only understand those precise words - unlike a more general intent classifier which can, to a limited extent, handle variations in the training set. Maybe that’s no longer the case and it’s closer to larger models?
I haven’t used SNIPS NLU yet. AFAIK developers must use an intent-based classifier (you would call it a “classic intent” model), not very different from DialogFlow or the Alexa Skill Interaction Model…
At run-time, when an intent is matched, SNIPS generates an event that is published via an MQTT broker. See the docs:
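To make the “classic intent” idea concrete, here is a minimal toy sketch (pure Python, intent names and utterances made up). Real engines like SNIPS NLU, DialogFlow, or LUIS use far stronger statistical models; the point is only that even a crude bag-of-words classifier handles *some* variation (word order, extra filler words) beyond the exact training phrases:

```python
# Toy "classic intent" classifier: score each intent by bag-of-words
# overlap between the utterance and that intent's training phrases.
# Intent names and phrases below are invented for illustration.

TRAINING = {
    "turn_on_light": ["turn on the light", "switch the lamp on", "lights on please"],
    "get_weather":   ["what is the weather", "weather forecast today", "will it rain"],
}

def tokens(text):
    return set(text.lower().split())

def classify(utterance):
    """Return (intent, score) for the intent whose vocabulary overlaps most."""
    words = tokens(utterance)
    best_intent, best_score = None, 0.0
    for intent, examples in TRAINING.items():
        vocab = set().union(*(tokens(e) for e in examples))
        score = len(words & vocab) / max(len(words), 1)
        if score > best_score:
            best_intent, best_score = intent, score
    return best_intent, best_score

# Unseen word order and the extra "please" still match:
print(classify("please turn the light on"))  # → ('turn_on_light', 1.0)
```

This also shows the limit the thread is discussing: any phrasing whose words fall outside the training vocabulary simply won’t match.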
https://docs.snips.ai/articles/platform/nlu
https://docs.snips.ai/articles/console/actions/set-intents
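A hedged sketch of consuming those MQTT events: SNIPS publishes intent matches on the broker under its Hermes protocol (topics like `hermes/intent/<intentName>`, broker on localhost:1883 by default). The exact payload fields below are an assumption loosely based on the Snips docs, so verify against your platform version:

```python
# Sketch: listen for SNIPS intent events over MQTT (Hermes protocol).
# Topic layout and payload shape are assumptions to check against the docs.
import json

try:
    import paho.mqtt.client as mqtt  # only needed for the live client part
except ImportError:
    mqtt = None

def parse_intent(payload: bytes) -> str:
    """Extract the matched intent name from a Hermes intent message."""
    msg = json.loads(payload)
    return msg["intent"]["intentName"]

def on_message(client, userdata, msg):
    print("intent matched:", parse_intent(msg.payload))

def subscribe_to_intents(host="localhost", port=1883):
    # Call this with a reachable broker; not executed here.
    client = mqtt.Client()
    client.on_message = on_message
    client.connect(host, port)
    client.subscribe("hermes/intent/#")  # all intents
    client.loop_forever()

# Hypothetical, trimmed example payload:
sample = b'{"input": "turn on the light", "intent": {"intentName": "turnOnLight", "probability": 0.92}}'
print(parse_intent(sample))  # → turnOnLight
```

The nice side effect of this design is that any process on the machine can react to an intent just by subscribing to the broker, without linking against the NLU engine itself.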
Thanks for the suggestion to include ChatScript and nlp.js, I will do so.
And maybe another related project too:
Of course the last-mentioned software are NOT personal assistants in the Almond sense! They are instead just tools (with which one could possibly build chatbots/voicebots/voice assistants).