Using Wave Federation Protocol for Augmented Reality and Geolocation applications.


I believe we need a server<>server standard for the exchange of geolocated data between individuals or groups. Much of the discussion and development in AR so far has focused on platforms that let a few publishers distribute geolocated data to the masses.
Little consideration has been given to an AR equivalent of e-mail, or to the social exchange of information between select groups.
It is my belief that the Wave Federation Protocol (WFP) makes a good foundation for this task.
It would provide a way for the private, public and social exchange of data without relying on a single company's server.
While the nature of a server protocol is somewhat orthogonal to the data format being displayed, both have to be developed together as co-operating standards in order to deliver a desirable end-user experience. Both public and private data should be visible in the user's field of view simultaneously, so ideally the data should be delivered to the AR browsers/clients by the same protocol.

Below I'll outline the main advantages of WFP and the principles and steps for using WFP to geolocate data.

Why use Wave to geolocate content?

There are quite a few reasons, but here are the main ones:

1. Wave is a federated, decentralised system. It allows anyone to share content between just a few people *without* all of them having to depend on the same third-party server. Like e-mail, Wave allows numerous users on independent servers to communicate and still be assured that only the people invited see the shared data.
Also, much like OpenID, a Wave user only needs to sign in once to access secured content shared with them, even though that content could be hosted on many independent WFP servers.
Without an open server<>server standard such as WFP, there is a danger of a single dominant company emerging to do this task, as we have seen with social networking sites on the web today.

2. Wave is a system that aggregates content into a list of streams of information for the user. The traditional web demands browsing, but for phones or future HMD systems, constantly switching and loading pages becomes impractical. Wave, by comparison, would let clients automatically download nearby data from the waves the user has subscribed to.
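As a rough sketch of this "nearby data" idea (the blip structure and the plain lat/lon keys here are invented for illustration, not part of any standard), a client could filter the blips of its subscribed waves down to those within range of the user:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two points given in degrees."""
    r = 6371000.0  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def nearby_blips(blips, user_lat, user_lon, radius_m=200.0):
    """Keep only the blips whose position lies within radius_m of the user."""
    return [b for b in blips
            if haversine_m(b["lat"], b["lon"], user_lat, user_lon) <= radius_m]
```

A real client would of course re-run this query as the user moves, and cache the results.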

3. Wave allows content to be moved and updated in real time. A 3D object could be moved in one client (if that client has permission), and all the other clients subscribed to the wave would see the 3D object move in real time.
Again, this happens regardless of which servers the other clients are connected to. As long as the servers are part of the federation, the changes will propagate to all the other servers in real time.
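At heart this is an observer pattern; the toy model below (the class and method names are mine, not WFP's, and it skips the server-to-server hop entirely) shows a single blip edit fanning out to every subscribed client:

```python
class ToyWave:
    """Toy model of a wave whose blip edits fan out to every subscriber."""

    def __init__(self):
        self.blips = {}        # blip id -> current position
        self.subscribers = []  # one callback per connected client

    def subscribe(self, callback):
        self.subscribers.append(callback)

    def move_blip(self, blip_id, new_position):
        # In real WFP the edit would propagate server-to-server first;
        # here every client is simply notified directly.
        self.blips[blip_id] = new_position
        for notify in self.subscribers:
            notify(blip_id, new_position)
```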

4. Scalable. Because anyone can run a Wave server and join the federation, the system can grow in proportion to its users. As demand for AR goes up, with more advanced HMDs and more constant connections, a system that doesn't rely on a few central servers will be critical to keep the user experience as smooth as possible.

Basic Principles of using Wave for AR.
[Diagram: a link between a 3D object and a real-world location or image is specified (stored as a blip, a single unit of data created by one or more users). A collection of these links forms a layer (a wave, a collection of blips). The user's field of view consists of many layers (all the user's subscribed waves).]

1. Each blip forms a "physical hyperlink": a link between real and virtual data.

This link consists of all the information needed to position arbitrary data, either in a fixed real-world co-ordinate system or in a co-ordinate system relative to a trackable image/marker.
For more details, see next section.

The data itself can be as simple as text inlined into the blip, or remotely linked content such as 3D meshes, sound or other constructs. (This content could be hosted locally or elsewhere, and downloaded via standard HTTP.)
The principle of using WFP for AR data exchange is neutral to the type of data you are linking to; standards would have to emerge for precisely what 3D objects, or 3D markup, are renderable by the end clients.
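To make the "physical hyperlink" idea concrete, here is one hypothetical shape such a link could take (every field name here is an illustrative assumption, not a proposed standard):

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class PhysicalHyperlink:
    """One blip's worth of AR link data. Field names are illustrative only."""
    lat: Optional[float] = None        # fixed real-world position...
    lon: Optional[float] = None
    alt: Optional[float] = None
    marker_uri: Optional[str] = None   # ...or a trackable image/marker
    data_uri: Optional[str] = None     # remotely linked content (mesh, sound...)
    inline_text: Optional[str] = None  # or simple text inlined in the blip

    def is_marker_relative(self) -> bool:
        """True when the data is positioned relative to a tracked image."""
        return self.marker_uri is not None
```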

2. A wave is a collection of blips; in AR, this would represent a single layer over the user's field of view.

Standard Wave server functions allow waves to be subscribed to, created, and shared with others. Each wave can have one or more blips created by one or more people.
Using AR within Wave would give end users the same freedom to create their own content and collaboratively edit it with friends.
Waves can also have bots added to them that are free to manipulate the blip data. This allows interactive and game functionality.
No extra protocol work is needed at this level, as this is all native WFP functionality.
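As a minimal sketch of such a bot (the function and the 'yaw' annotation key are invented for the example), imagine a bot that spins a linked 3D object by nudging its yaw annotation on every tick; every subscribed client would then see the object rotate:

```python
def spin_bot(blip, degrees_per_tick=5.0):
    """A toy bot: each tick, increment a blip's yaw annotation so the
    linked 3D object appears to rotate for every subscribed viewer."""
    yaw = float(blip.get("yaw", 0.0))
    blip["yaw"] = (yaw + degrees_per_tick) % 360.0
    return blip
```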

3. The end client would render all the user's waves as layers in their field of view, giving them a personal aggregation of public and private content.

Specific example of key/value pairs that could be stored to position AR content.

The key/value pairs stored would be the information required to allow a client to position any data at a real-world location.
The data itself could be anything; ARWave is a proposal for how to exchange the positioning information, not a specification for what the data being positioned is. Various formats for that would have to be agreed and standardised separately.

Also, this is a preliminary list, shown here only as an example. The precise key names and value formats should be agreed upon and standardised; that way, anyone could create a client and be guaranteed compatibility.
Nor should this list be seen as complete; other key/value pairs might be needed in future.

These key/value pairs would be stored as 'Annotations' in the blip specification. Annotations basically allow any arbitrary collection of key/value pairs to be stored.

Longitude / Latitude / Altitude: 'double' type numerical values specifying the position of the data. Alternatively, a single number plus an offset could be used if that proves more practical.

Roll: degrees of rotation around the front-to-back (z) axis (lean left or right).
Pitch: degrees of rotation around the left-to-right (x) axis (tilt up or down, aka elevation).
Yaw: degrees of rotation around the vertical (y) axis, relative to magnetic north (aka bearing).

Data Reference Link: instead of a fixed position specified by the above, data can be positioned by an image specified here; the image's orientation determines the data's position and rotation in space. If both a numerical position and an image link are specified, the position is treated as an offset to the tracked image.

Co-ordinate System: a string specifying the co-ordinate standard used for the above.

Data MIMEType: the MIME type of the data linked to. (The data is not necessarily a 3D mesh; it could be sound, text, markup, etc.)

Data / Data URI: a link to the data itself. This could be a normal static-IP-hosted HTTP server, but could also be an IP address and port number pointing to temporarily hosted data on a client.

Last Updated: the last time the linked data was updated.

Metadata: could be a single string field of descriptive tags, or (more usefully) a separate set of key/value pairs with a common starting pattern, to help form a more detailed semantic description of the object linked to.

Many of the key/value pairs on this list would also be optional, depending on the situation. For example, a piece of inline text content, stored in the blip's content field, would not need an HTTP link to its data.

Example schematic of how a WFP AR client could work:

Essentially, a Wave API would return, and keep updated, a set of AR blip data from the user's waves. The client would then download any other needed data and render the results in the user's field of view.
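One update cycle of that schematic could be sketched like this (a hypothetical, simplified client loop; the wave and fetch interfaces are assumptions): each subscribed wave becomes one layer, and each blip's content is either inline text or fetched over HTTP:

```python
def build_layers(subscribed_waves, fetch):
    """One client update cycle: for every wave the user subscribes to,
    collect its AR blips, resolving each blip's content either from its
    inline text or by fetching its data URI. Returns one layer per wave,
    ready to hand to a renderer."""
    layers = []
    for wave in subscribed_waves:           # each wave is a list of blips
        layer = []
        for blip in wave:
            content = blip.get("inline_text") or fetch(blip["data_uri"])
            layer.append((blip, content))
        layers.append(layer)                # one field-of-view layer per wave
    return layers
```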

Additional Resources

"Everything Everywhere" by Thomas Wrobel -

ARWave organisation homepage (including basic demo video) -