Safari on the iPhone is built on Siri and shares part of the old graphic underpinnings of macOS Safari. Siri is not a personal assistant; it is an AI voice interface with a database interpreter. The coming versions of Safari and Siri for all devices will be further integrated, with a new graphic shell for Safari using the same expanded database interpreter, now including graphic recognition, as part of the new version of Siri. The legacy underpinnings of Safari will be gone. Both may look the same to the user, but the move toward microcoding for addressing machine language means an exponential increase in both speed and power.

This also means both can function entirely within a device, no longer requiring a net connection, so the combined code can serve as a base for apps built on Swift, increasing the versatility of app development. Why reinvent the wheel? Graphic shells will matter more than code for accessing functionality, with containers from libraries of predefined functions attached to graphics in series. This takes interpreted languages to the extreme for quick and easy development. Developers seeking previously unavailable complex functions need only write them once to new containers, adding to the libraries, which will also support locks for specific developer use. From here on, scalability takes on a whole new meaning.

Activated interfacing with Apple's iCloud storage creates inestimable power. For the health care industry alone, this is revolutionary. As a basis for other human endeavors, only the imagination of mankind becomes the limiting factor, and that, in turn, will be expanded by AI.
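To make the container idea concrete, here is a minimal sketch of what "predefined functions attached to graphics in series, with developer locks" might look like in Swift. Every name here (`FunctionContainer`, `GraphicElement`, the sample library) is invented for illustration; this is not an actual Apple API, just one plausible shape for the architecture described above.

```swift
// A container wraps one predefined function from the shared library,
// plus an optional lock restricting it to a specific developer.
struct FunctionContainer {
    let name: String
    let lockedTo: String?          // nil means usable by any developer
    let body: (String) -> String   // the predefined function itself

    func run(_ input: String, developer: String) -> String? {
        // Honor the per-developer lock described in the text.
        if let owner = lockedTo, owner != developer { return nil }
        return body(input)
    }
}

// A graphic element is just a label plus an ordered series of containers;
// the developer wires functionality by attaching containers, not writing code.
struct GraphicElement {
    let label: String
    let pipeline: [FunctionContainer]

    // Activating the graphic runs its attached containers in series.
    func activate(_ input: String, developer: String) -> String? {
        var value = input
        for container in pipeline {
            guard let next = container.run(value, developer: developer) else {
                return nil  // blocked by a lock somewhere in the series
            }
            value = next
        }
        return value
    }
}

// A tiny stand-in for the shared library of predefined functions.
let library: [String: FunctionContainer] = [
    "uppercase": FunctionContainer(name: "uppercase", lockedTo: nil) { $0.uppercased() },
    "exclaim":   FunctionContainer(name: "exclaim",   lockedTo: nil) { $0 + "!" },
]

// Attach two library containers, in series, to one graphic element.
let button = GraphicElement(label: "Shout",
                            pipeline: [library["uppercase"]!, library["exclaim"]!])
print(button.activate("hello", developer: "dev1") ?? "blocked")  // prints "HELLO!"
```

The point of the sketch is the direction of dependency: the graphic shell composes existing containers, and a developer extends the system only by adding a new container to the library once, after which any shell can attach it.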
You're now in the 21st century. AI is producing code. Microcoding changes the field of play completely. All your limiting preconceptions are history. Art follows life, life follows art.
Here's a little kicker for you. Researchers are already building prosthetic eyes based on Siri microcode, designed to interface directly with the human brain. This is the holy grail for all prosthetics.