We foresee on-device inference with local models powering the next generation of consumer AI applications. Such applications would have three main advantages:
1) Access to local context
2) Data privacy and visibility
3) Near-zero compute cost
These advantages directly counter what we believe are the two great risks artificial intelligence poses to humanity (as a species, and as a culture):
1) Increasing centralization of models, with the attendant client-side privacy risks and surveillance
2) An over-reliance on quick, universal “truth,” cognitive debt, and the corruption of individual self-actualization
We propose a solution that brings the oft-stated benefits of artificial intelligence (enhanced learning and leisure) securely to the everyday consumer, through a format humans have used for centuries: life- and knowledge-logging, or, more informally, journaling.
We further propose developing this into a personal ecosystem of thought, and into a decentralized knowledge ecosystem and exchange we provisionally call "context containers."