The Personal Archive has its origins in the silicon era. The earliest silicon-based machines, called "computers", were built to perform mathematical calculations, so provisions for information security were not included in their designs, which were based on the flexible but inherently insecure Turing Machine architecture. The problems were exacerbated by the highly centralised models of information storage and a general lack of concern for the privacy of personal information.
Concern for personal privacy spread as Knowledge Technologies, then categorised as Artificial Intelligence, grew in sophistication, and its impact deepened as the amount of detailed personal information held on individuals multiplied. The Privacy Architecture was developed to replace the Turing architecture, and quickly evolved into the PA.
Seven fundamental changes were introduced to produce the PA as we know it now. These were:
1. Because of the high cost of early silicon memory, computers had transient memory, which allowed records to be modified. In a PA the memory is indelible. This is a necessary condition for records to be trusted.
2. The architecture does not provide any physical means of reading an archive other than through a hardware gatekeeper. Access via the gatekeeper is under the sole control of the owner of the PA. This is a necessary condition for privacy.
3. The gatekeeper records all its actions as part of the archive, which is a necessary requirement for reconstruction, analysis and verification of its actions. This underpins both trust and privacy.
4. Rather than the arcane, specialised control languages used in computers, a human language serves as the operational language of the device down to the hardware level. This gives operational transparency and a direct means for the owner to provide instructions and to check that they are being interpreted correctly.
5. There is a core set of standard access rules. These provide trusted answers to basic questions such as ‘Who are you?’ along with diagnostic evidence that the answer was derived from the core rules. The core uses an unambiguously defined subset of the natural language in use.
6. Manufacture is completely transparent. To trust a PA, one must know exactly what is going on inside it, or rely on a wide community of users who have checked the system one starts with. This has been the most difficult requirement to satisfy, since it relies on trusting others with the construction. It has been addressed through multiple projects in which a diverse range of people construct the units.
7. Public use of PAs has required the introduction of technical protocols for maintaining the privacy of others.
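The first three principles — indelible memory, gatekeeper-mediated access, and self-auditing — can be illustrated together. The following is a minimal, purely notional sketch; the `Archive` and `Gatekeeper` classes and all method names are assumptions for illustration, not part of any PA specification.

```python
class Archive:
    """Indelible store: records can only be appended, never modified
    or deleted (principle 1)."""
    def __init__(self):
        self._records = []

    def append(self, record):
        self._records.append(dict(record))
        return len(self._records) - 1  # index serves as a permanent reference

    def read(self, index):
        return self._records[index]


class Gatekeeper:
    """The sole access path to the archive, controlled by its owner
    (principle 2). Every action it takes is itself recorded in the
    archive (principle 3)."""
    def __init__(self, archive, owner):
        self._archive = archive
        self._owner = owner

    def request(self, requester, index):
        granted = (requester == self._owner)
        # The gatekeeper's decision is appended indelibly, so it can
        # later be reconstructed, analysed, and verified.
        self._archive.append(
            {"event": "access", "requester": requester, "granted": granted}
        )
        if not granted:
            raise PermissionError("access denied: not the archive owner")
        return self._archive.read(index)
```

In this sketch the audit trail and the owner's own records live in the same append-only store, so any attempt to read the archive leaves a permanent trace — which is what makes the gatekeeper's behaviour verifiable after the fact.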
Figure: A notional representation of PA architectures.