Western Alliance 2005 Special Session
FULLDOME STANDARDS DISCUSSION

Updated: 19 September 2005

I am appending some of the notes Kevin Scott took at the Western Alliance 2005 conference, during a Saturday-morning session devoted to fulldome video standards. Many things were discussed, around a very large table that made it difficult to see and hear everybody. Please interpret the notes accordingly.

Also feel free to look at a PDF of the draft specification as revised in Denver.

And thanks, Kevin, for taking all this down!

Enjoy!

Ryan, a.k.a.
Ryan Wyatt, Science Visualizer
Rose Center for Earth & Space
American Museum of Natural History
79th Street & Central Park West
New York, NY 10024
212.313.7903 vox
212.313.7868 fax


FULLDOME STATE-OF-THE-MEDIUM DISCUSSION
Western Alliance 2005, Denver, Colorado
10 September 2005

Dan Neafus (DMNS):
Introduction

Ryan Wyatt:
Talk about draft standard for dome master format: it was discussed in some detail at DomeFest in Albuquerque, New Mexico. The DomeFest group also engaged in a broader discussion of how IPS fits in, with most of the discussion centered on how the community can share ideas on how to produce content (e.g., budget, staffing, etc.).

I’d like to start by discussing what people see as the primary issues that the planetarium community is confronting with fulldome video technology. How can the community address them constructively?

Steve Dosremidios (Chabot):
Is it useful to create a dichotomy between real-time and playback?

Ryan Wyatt:
The dome master is, in Ed’s words, the “low hanging fruit” and easy to pick. Most folks are able to exchange playback content with relative success, so we should codify recipes for success.

Quick show of hands: about half of the room are owners/vendors/users of fulldome video technology. Why are others here? (Meaning that in a nice way.)

Bing Quock (Morrison):
Making the transition from slides to fulldome video technology. Expect to open in a few years. Newbies — trying to figure out how to reach AMNH production quality (or strive towards it) — and how to get started, especially in production.

Two perspectives, from the administration versus the end users: administration wants the museum to create its own content, but the budget doesn’t exist. Want production to be simple enough that they can produce the shows they want. Anyone could put together a slide show. Now you need people with special skillsets.

Steve Savage (Sky-Skan):
Like having a good special effects person in house. Tools are getting easier to use. Dome master is now seven years old. Here we are trying to standardize it now. Three million frames have been made. (There’s never been three million of anything in the planetarium world.) By default we already have a standard — we want to make it easier to share.

Storage space is biggest concern. How to address storage? Move from lossless to JPG?

Alan Caffey (Morgan Jones Planetarium):
Coming from the perspective of a smaller dome, within the educational community. Try to make sure technology stays accessible to teachers and small facilities.

Steve Savage:
There are two forces working: making it simple, small, and manageable, versus wanting the very best picture for my audience. Try to balance these two forces. Some discussion of technical format details. Worry about production budgets when trying to address quality.

Ryan Wyatt:
The fundamental issue is facilitating exchange — critical for poor planetarians. Real-time/playback should move as seamlessly as possible. Use them constructively so that everyone from portables to AMNH can share content.

Fred Chavez (Central Texas College):
Considering switching to fulldome video technology. Attending to gain insight into what it takes to use fulldome video technology.

Steve Savage:
Two camps here, real-time and prerendered; we can really only address prerendered.

Ryan Wyatt:
Try to keep floor open for the first hour, hour and a half. Later we’ll discuss details.

Steve Savage:
Real-time is really complicated at the moment. Vendors are trying to maintain certain proprietary aspects in order to distinguish between their products. Discussions will distract us from getting work done on dome masters.

Ryan Wyatt:
IPS needs to provide some sort of information resource — on terminology, for example, in order to generate a vocabulary.

For example, in real-time discussions I’ve had lately, it seems obvious that normal mapping is something we want implemented in the systems — but do they support that? Outline points of commonality and supported features. What tools/models will work on the various platforms? In other words, how can we put that information out? Best practices, conversation points?

Mark Petersen:
Exchanging shows is different from exchanging models and pieces of shows. Everyone plays back shows? Bits and pieces for real-time is down the road. Focus on playback only.

Ryan Wyatt:
As Steve Savage noted, the playback challenge has been around for many years. The real-time challenge is now emerging. Dan made a good point: this is the time to write these sorts of issues into grants.

Mickey Schmidt (USAF):
Still interested in reopening academy planetarium. Eventually want to produce fulldome video technology. Will probably not be available en masse — but may distribute segments. Here to learn. One specific goal to learn about dome masters. Wants to make sure that when he produces something, it’s usable.

Staffan Klashed (SCISS):
I agree we should probably focus on playback. Real-time challenges have already been faced by other communities (e.g., gaming) and it’s probably just a matter of learning from those communities.

Matt (DMNS):
What are the various considerations for dome masters?

Ryan Wyatt:
Perhaps it would help to list some of the challenges faced by AMNH in distributing content:

Mark Petersen (Loch Ness):
Audio has nothing to do with the dome master. Let’s agree on the dome master first, then address the other components. So many things we all want to agree on.

Ryan Wyatt:
Section 3 of standards is all about audio.

Steve Savage:
Sky-Skan has volunteered to make all these standard files available. 1k to 8k, audio formats, draft ready soon. Will address laser gamuts, color depths, etc. Make sure our direction works with what’s happening in the real world. Digital Cinema — need to make sure we have some relation to that. Their samples will include metafiles. They will include sync tests, etc., as well as 40-min sync tests.

Dan Neafus:
Need to remind about international considerations. Language and such. Need to remember the rest of the world.

Ryan Wyatt:
Take one last opportunity to capture anything that should be said to the folks at IPS — i.e., this is what we need to do to support people’s transition to the new medium.

Per (SCISS):
How are they transported from one site to the next? How do you deal with 75k frames at 3k each? Platforms.

Unknown:
Test clips are used to normalize theaters?

Steve Savage:
Everyone should be able to do black and five (test of low end of the system). Black and five helps you tell if you’re able to properly display things like the Milky Way.
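The “black and five” check could be scripted. Here is a minimal sketch, assuming the test frame is simply alternating bars at gray levels 0 and 5 written out as a binary PGM file; the bar pattern and file format are assumptions, since the session only names the two levels being compared:

```python
# Hypothetical "black and five" low-end test frame: alternating bars
# of gray level 0 and gray level 5, written as a binary PGM file.
# Bar layout and PGM output are assumptions, not part of any spec.

def black_and_five_frame(width=512, height=512, bar_width=64):
    """Return raw 8-bit pixels alternating between levels 0 and 5."""
    row = bytes(5 if (x // bar_width) % 2 else 0 for x in range(width))
    return row * height

def write_pgm(path, pixels, width=512, height=512):
    """Write raw grayscale pixels as a binary (P5) PGM file."""
    with open(path, "wb") as f:
        f.write(b"P5\n%d %d\n255\n" % (width, height))
        f.write(pixels)

write_pgm("black_and_five.pgm", black_and_five_frame())
```

A theater that cannot distinguish the level-5 bars from the level-0 bars will likely crush faint content such as the Milky Way.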

Charlie Morrow:
Easy to build real-time audio converters. Need to survey and then come up with standard application/platform. Suggest study group for audio.


[Moving to standards for dome masters.]

Ryan Wyatt:
Everyone should have a paper copy of Ed’s spec from DomeFest. We’ll do a real-time edit. We have a quasi-official committee within IPS. The point we’re at now is to pick the low-hanging fruit and refine it.

Talk through the document. (Fulldome Master Show File — IPS Standard Adoption v0.2 — 30 August 2005)

Want to process metadata and spit out perfect imagery for the dome. Ideal world.

[Overview of document]

No general comments — let’s forge ahead. Keep in mind that there are many ways to execute a standard. IPS approved software for example. Or best practices and recommendations that help facilitate transfer.

[Metadata specification]

Skeletal structure of data to be filled in. Recognize data in XML file — maybe it’s initially a text file.

Mark Petersen:
Could process text and create XML down the road.

Bruce Thatcher:
Any browser can read XML.

Staffan Klashed:
XML is the easier option.

Matt (DMNS):
How many folks out there are going to have the software capability to process these files?

Mark Petersen:
Have both readme.txt and readme.xml.

Ryan Wyatt:
Seems like the best option is to define text and XML formats and provide software able to translate between the two.
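As a rough illustration of the text-to-XML translation proposed here, the sketch below turns “key: value” readme lines into an XML document. The field names are placeholders; the actual metadata fields are defined in Section 1 of the draft spec.

```python
# Minimal sketch of a text-to-XML metadata translator. The field
# names ("title", "frame_count") are placeholders, not spec fields.
import xml.etree.ElementTree as ET

def text_to_xml(text):
    """Convert 'key: value' lines into a <dome_master> XML document."""
    root = ET.Element("dome_master")
    for line in text.splitlines():
        if ":" not in line:
            continue  # skip blank or malformed lines
        key, _, value = line.partition(":")
        ET.SubElement(root, key.strip()).text = value.strip()
    return ET.tostring(root, encoding="unicode")

readme = "title: Sample Show\nframe_count: 75000"
print(text_to_xml(readme))
# <dome_master><title>Sample Show</title><frame_count>75000</frame_count></dome_master>
```

Going the other direction (XML back to readable text) is a similarly small traversal, which is what makes the dual-format approach cheap to support.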

Steve Savage:
Add copyright statement to Section 1.2 — show name, etc.

Bruce Thatcher:
Will need to specify rights issues, circumstances of use. Can be clearly stated in XML structure. These are the rights that are available or not available; here’s how to acquire rights, etc.

Steve Savage:
Statement should be made — that statement can refer to another document — a license agreement. Link to the other place.

Carolyn Collins Petersen (Loch Ness):
Copyrights don’t necessarily always encapsulate all the various sources and rights.

Mark Petersen:
Assumes dome master frames. What if you’re distributing MPEGs (e.g., the number of frames is irrelevant for MPEG distribution)?

Unknown:
Needs to be a hierarchy of format, generalize to number of frames (dome master files or encoded video file).

Anthony Braun (AMNH):
Should allow for frames across many drives — do we need to distinguish how those frames are distributed on various media?

Bruce Thatcher:
Reformat every time for every distribution.

Unknown:
Media structure needs to be added. Metadata needs to refer to the entire document and not the media.

Unknown:
Theater specific media structure.

Anthony Braun:
For example, 0 versus 1 in frame numbering: start on 0 or start on 1? Where do we start? And counting frames. Video wants 1; geeks want 0.

Steve Savage:
This is the time to fix that.

Ryan Wyatt:
What is the expressed preference? First frame definition can also include frame names/file names.

Steve Savage:
Naming scheme is all over the place.

Unknown:
Some folks have very rigid file name structures.

Anthony Braun:
Do we just define how “we” make our file names — or do we go down the path of recommending conventions?

Unknown:
Metadata defines file name pattern.
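A hedged sketch of how a file name pattern carried in the metadata might be consumed, assuming a printf-style pattern plus a first-frame number (the 0-versus-1 question raised above). The pattern syntax shown is an assumption, not part of the draft spec:

```python
# Hypothetical expansion of a metadata file name pattern into the
# list of expected frame files. The printf-style "%06d" syntax and
# the field names are assumptions for illustration only.

def frame_names(pattern, first_frame, count):
    """Expand a file name pattern into the expected frame file names."""
    return [pattern % n for n in range(first_frame, first_frame + count)]

# e.g. metadata says: pattern "show_%06d.tga", first frame 1, 3 frames
print(frame_names("show_%06d.tga", 1, 3))
# ['show_000001.tga', 'show_000002.tga', 'show_000003.tga']
```

With the pattern and first-frame number in the metadata, a playback system can verify a delivery (and report exactly which frames are missing) regardless of which convention the producer used.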

Carolyn Collins Petersen:
Some users don’t understand the patterns.

Staffan Klashed:
Metadata and media management are separate.

Alan Caffey:
Newbies are going to be scared to death of all this potentially. High level document separate from the details?

[Extensive discussion on complexity of document/users.]

Ryan Wyatt:
FITS files, for example, contain more metadata than you need.

Benjamin Mendelsohn (West Valley Community College):
How am I gonna get this stuff — on hard drives or over the net? How do I deal with platforms/directory structures, can I even mount the drive?

Jodi Schoemer (DMNS):
File needs to explain itself and what it is — what it contains.

Ryan Wyatt:
Readme for the readme file.

[Recap of metadata in Section 1.]

Will metadata be present on each drive? Same metadata or different?

Should be same data on each drive. Each document should describe all data across all drives.

Steve Savage:
Drives failing is a major problem — these documents can help this problem by facilitating quick communication of what’s missing.

When the manufacturers try to use this data in their processing programs, they may come back and want to fiddle with this.

This will have to be passed around the manufacturers — so that they can comply.

Ryan Wyatt:
Is it a reasonable request that the manufacturers provide a small tool that would query the producer to fill in all this data?

Unknown:
Request for comments?

Steve Savage:
Manufacturers that want to play won’t have much problem with this.

Need to define how each codec was used (JPEG, 24-bit, 100% quality, etc.).

Ryan Wyatt:
Strike #5 and integrate into #7 (still in Section 1).

Bruce Thatcher:
Specify dome tilt that it was designed for — tilt and orientation — seating configuration and orientation.

Dan Neafus:
May need to refer to other documents that describe a style of theater.

Unknown:
Can a customer send their XML spec to a vendor and get back appropriate content?

Jodi Schoemer:
DMNS tested a variety of content, and a 15 degree tilt works for just about everything.

Steve Savage:
Great circle reference line needs to be in the dome master. 15 degrees could be suggested as ideal — studies have shown...

Jodi Schoemer:
Denver not a cap centered dome. Where is the center of interest? The show sweet spot?

Steve Savage:
Sweet spot could be shown in the dome master (grid/guide).

[Discussion of number of “audio tracks” — # of audio files on the disk.]

Most folks don’t distribute stems in general. Is it allowed — do sites have the facility to use stems properly?

[Rights management discussion.]

Ryan Wyatt:
Back to metadata! What can we list here to describe the basic options we want to have open to us?

Unknown:
Need stem policy? Need to be able to describe the tracks. Sometimes sub is derived from program. Finished formats and raw formats.

Anthony Braun:
Seems like we’re talking about two things: audio for direct display and audio for remix. Subs are a specific issue. Derived or specific.

Bruce Thatcher:
Need track specifications structure as subset of the audio, include audio file naming here, in the initial Section 1 about metadata.

Anthony Braun:
Where does one describe what each track is?

Steve Savage:
What content should come out of which speaker?

Bruce Thatcher:
How to relate audio tracks to speakers?

[Discussion of Audio System parameters and what it means. Where are the speakers in the dome?]

Steve Savage:
Collection will include channel check. Dome original should indicate where that speaker was originally located.

Anthony Braun:
Technical infrastructure is non-standardized.

Ryan Wyatt:
Recap — recipient would get a collection of files and how they were produced. Then — perhaps in the future — the metadata would describe how to reproduce a sonic environment.

[Notes taker is quite confused at this point.]

Anthony Braun:
We know the audio issues are quite complex — and yet we have fewer metadata items for audio than video. As a practical technical issue — a specific number of tracks will be received — does this document need to specify what the end user does with them?

Ryan Wyatt:
Audio dome management?

Unknown:
Need to define virtual placement of sound. Producers define where sounds are designed to come from.

Unknown:
Coordinate based naming structure instead of L-C-R-LS-RS?

Steve Savage:
Where is south — 180 or 0 degrees?

[Topic dropped — complexity.]

Anthony Braun:
Virtual placement is one issue — intended use for the tracks (playback/mixing) is another.

Unknown:
Virtual coordinates would be used as a map.

Anthony Braun:
Producer has questions as to how to provide that information.

Ryan Wyatt:
We want to make the XML sufficiently general such that we might be able to accommodate future virtual mixing.

[Editing of audio items in Section 1.]

Anthony Braun:
Standard designations — stereo, 5.1, and then openness for complexity.

Ryan Wyatt:
We provide the slots in the XML file for others to populate. Trying to make this as plug-n-play as possible. Try to accommodate basic pass-through of audio, and virtual mixing.

Dan Neafus:
Would like this to be a precursor to gaming/interactivity. RT sound movement — versus baking the sound into a set number of tracks that are designed to emanate from the dome.

[Discussion of complexities.]

Ryan Wyatt:
Presumably sounds have been created to match visuals. We want to encapsulate data that helps these sounds move from one dome to another.

What are stem files?

Unknown:
Stem files allow an audio guy to remix audio.

Ryan Wyatt:
Are stems the best way to distribute?

[Note-taker missed quite a bit from a guy down at the end of the table...]

Ryan Wyatt:
Real-time vs. playback.

Bruce Thatcher:
Standard needs to be scalable.

Anthony Braun:
Stem mix is a loose analogy to a collection of layered visuals — the rawer elements. Could then be used for virtual placement of sound — or you could use it to simply remix for a pleasing sound in your dome.

Ryan Wyatt:
[More editing of document with respect to audio.]

Anthony Braun:
Include timecode format — LTC, VITC, non-drop, drop, etc.

Include “Other”


Moving on to Section 2 — Dome Master Spec

Steve Savage:
[Showing sample images.]

Should be a standard sync test. Something the weekend operators run to test the system. A tech crew test.

Developed over a dozen tech loops.

Color depth check — more complex tests.

White frame — two seconds of white with a pip for sync testing — MPAA standard

Need proof (for producers) that a show is in sync. And we really need 40 minutes of sync to show that the system works. Perhaps use an end pip?

Two seconds of black on front and back — don’t want to start on first frame. Need some amount of lead-in.

[Discussion of pips, detecting pips, and audiences.]


Break — with wrap at 1:30pm

Ryan Wyatt:
We have now arrived at the dome master spec itself!

Talking about either a standard or best practice recommendation. Finally getting to be more than just descriptive elements.

First off is render format — “fisheye,” etc.

Ka Chun Yu:
Resolution suffers when rendering a full sphere. Language of full sphere should be left out.

Ryan Wyatt:
Ed was placing logical upper limit on dome master.

Steve Savage:
This is a hemispherical spec?!

Ryan Wyatt:
Don’t need to leave out the info.

Mark Matthews (Visual Acuity):
[Referring to previously displayed dome grid.] There are already standards in display systems that have existed for years in the simulation industry. Rather than creating a different standard for planetariums, it would be better to adopt standards that already exist. This would mean describing the field of view of both a screen and a dome master in terms of vertical and horizontal fields of view. In this format a horizontal planetarium screen would be defined as 180 degrees vertically and 180 degrees horizontally — not as 360 degrees, as some people currently describe it. When a tilt is applied to the screen and the front dips below the horizontal viewing plane (from either dome centre or a viewer’s position), then the 180 degree vertical field of view is described as -10/+170 (for a 10 degree tilt). I can explain more if necessary (as could people from the simulation sides of E&S, SEOS, Spitz, Astro-Tech, SGI, etc.). This nomenclature is used worldwide and can describe any shape of screen. If we were always talking about 180 degree screens with a tilt there would be little complication. In fact we are mostly dealing with domes that have a field of view of less than 180 degrees and are sometimes truncated in more than one axis.

The same principle should apply to audio and other elements of the Fulldome standards process. Audio and other elements have standards already in use and it would be best to adopt and adapt these wherever possible. This will enable people who join the planetarium club from video or audio backgrounds to better understand the terminologies and reduce the scope for confusion.
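The nomenclature Mark Matthews describes can be captured in a few lines. This sketch assumes the general rule behind his -10/+170 example, namely that a 180-degree screen tilted by t degrees has a vertical field of view of -t/+(180 - t):

```python
# Sketch of the simulation-industry vertical field-of-view notation:
# a hemispherical screen tilted by t degrees spans from -t below the
# horizontal viewing plane to +(180 - t) above it. The general
# formula is inferred from the -10/+170 example in the discussion.

def vertical_fov(tilt_degrees, screen_degrees=180):
    """Return the vertical FOV designation for a tilted dome."""
    low = -tilt_degrees
    high = screen_degrees - tilt_degrees
    return "%+d/%+d" % (low, high)

print(vertical_fov(10))  # -10/+170
print(vertical_fov(0))   # +0/+180
```

The same two-number form extends naturally to truncated or less-than-180-degree screens, which is the portability argument being made here.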

Ryan Wyatt:
Less-than-180-degree domes are quite common. Working with the “planetarium” environment.

Mark Matthews:
Masters of more than 180 degrees are more portable and more useful. If it’s there and you don’t use it — fine.

If the dome master has as large a field of view as possible, this enables the render of “show masters” to suit the tilt of the theatre it is being shipped to. This means that the dome master is not necessarily the item being shipped, but that shows can be rendered from it with the correct field of view for the destination facility, so that it can accommodate both horizontal and tilted domes. It is also important to bear in mind that some companies supply their shows in a pre-divided format where each disc is one display channel’s section of the fulldome image. In some cases these have distortion and digital blending within the render. Does this affect the metafile data? Maybe this is no longer within the realms of “dome master”?

Anthony Braun:
Differentiate designation for image structure and facility. Uniformity, not difference.

Mark Petersen:
Section 2, Item 1, as worded is okay.

Ryan Wyatt:
Moving on... Some discussion as to how exhaustive we should be in enumerating file formats. Should there be an IPS recommendation along these lines? Advocating that all vendor systems should support the various file formats.

Mark Petersen:
This is again dome masters and not muxed MPEGs. This is for show distribution? Again — some folks distribute MPEGs.

Ryan Wyatt:
Expands discussion from just sequential frames to video formats.

Matt (DMNS):
Most display systems are more “lossy” than JPG compression?

Dan Neafus:
Phrase and recommend best practice.

Mark Petersen:
Recognize various formats.

Steve Savage:
Define the top and bottom of the heap?

Ryan Wyatt:
[Edits doc to reflect lossless and low-loss file formats/files.]

[Ed. note: where do bit depths come in?]

[Discussion of levels of lossy-ness.]

Bruce Thatcher:
Wouldn’t really recommend lossy files at all.

[Discussion of encoded movie files — encoders.]

[Discussion of bit depths.]

Steve Savage:
Video cards support 10 bit.

JPEG-2000 is the only current playback medium to support higher bit depths.

[Discussion of resolution.]

Kevin Scott (E&S):
Best practices in production.

Dan Neafus:
Black hole document did talk about star diameters. Also covered some in-field research.

[Discussion of minimum highest resolution.]

The diameter of the dome master is different from the dome master frame size.

Ryan Wyatt:
[Edits document for various resolutions.]

Steve Savage:
Sky-Skan will provide a collection of standard frames and test patterns/video to the community.

[Discussion of safe area/sweet spot — unidirectional.]

[Discussion of nominal camera tilt — great circle, horizon/sky.]

[Discussion of including frame number and timecode.]

Mark Petersen:
Move both to upper left — timecode on top, then frame number. Some systems don’t mask their lenses enough — some distributed MPEG files with labels in the corners would get projected.

Bruce Thatcher:
Upper RH corner reserved for production data (producer use).

Steve Savage:
Add handles and pips. 2-3 seconds on front (60 frames), 3 seconds at end, with a pip 1 second before the end of the picture.

Anthony Braun:
We use 5 seconds. Is pip defined as being within the handle?

Steve Savage:
Don’t want audience to see/hear pip. Really won’t modify existing 3 million+ frames to be compliant.

Mark Petersen:
Let distributors define pips?

Matt (DMNS):
Are pips useful for daily use or only on installation?

Steve Savage:
Do we need an installation section or a quality control section of the document?

Ryan Wyatt:
[Edits doc for handles and pips.]
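The handle-and-pip layout just edited into the document can be sanity-checked with a little arithmetic. This sketch assumes 30 fps (consistent with Steve Savage’s “2-3 seconds on front (60 frames)”) and a pip one second before end of picture; the specific numbers and function are hypothetical:

```python
# Rough sketch of the proposed handle/pip layout: a 2-second head
# handle, a 3-second tail handle, and a pip one second before the
# end of picture. The 30 fps rate matches the "2 seconds = 60
# frames" figure from the discussion; all else is an assumption.

FPS = 30

def show_layout(picture_frames, head_sec=2, tail_sec=3, pip_sec=1):
    """Return (first picture frame, pip frame, total frames) for a show."""
    head = head_sec * FPS
    first_picture = head                      # frames count from 0 here
    end_of_picture = head + picture_frames
    pip_frame = end_of_picture - pip_sec * FPS
    total = end_of_picture + tail_sec * FPS
    return first_picture, pip_frame, total

# a 40-minute show: 40 * 60 * 30 = 72,000 picture frames
print(show_layout(72000))  # (60, 72030, 72150)
```

Note this counts frames from 0; the 0-versus-1 numbering question from the metadata discussion applies here as well.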


Section 3 — Audio Files

Ryan Wyatt:
[Introduces audio and Ed’s original notes.]

Steve Savage:
Need to specify timecode start time when providing stem tracks (1 hr start vs. 0:00:00).

Mark Petersen:
Recommends some file name changes (along with Steve Savage).

Unknown:
Dialogue track is separate?

Mark Petersen:
Recommend C for “center” and T for “top.”


Section 4 — Standard Tech Trailer

[Quick discussion — see what is left (at 1:30pm) so we can let folks go. Tech trailer will be taken offline with Steve Savage, Ryan Wyatt, and Mark Petersen volunteering to work with Ed Lantz on the topic.]


Section 5 — Distribution Medium

Unknown:
Do we want to define file structure?

Ryan Wyatt:
Adds information about top level directories: “Frames” and “Audio,” for example.

Alan Caffey:
DVDs often appropriate for show distribution.

[Another quick discussion of frame rates.]

Bruce Thatcher:
What naming conventions do we want for folders?


[Wrap up.]

Ryan Wyatt:
Is everyone present a part of the fulldome mailing list?