Thursday, November 15, 2012

Is (Motion Picture) Film Dead?

A friend wrote me the following message on July 27, 2012:

Hi,
I somehow found this article and thought of you and some of your recent emails - most especially the one about Grand Prix (this friend and I have a long history around this  1966 John Frankenheimer film, which happens to have been shot on 65mm film - E).
I hope I am not stepping on any toes here, but I have to admit I absolutely hate digitally projected movies. Truth be known though, I never really thought about it very much until a recent set of experiences...  After many years of not setting foot in a movie theater at all (due to a combination of kids, schedule, and a lack of interest in the available fare), last year I went with my family to see “Gnomeo and Juliet,” a wonderful animated film that I enjoyed very much.  I made note of the fact that it was a film projection, but didn't really think about it much.  However, less than a month later, we went to the same theater and saw “Rio,” yet another animated feature, and a good one as well.  BUT, this one was not film, but rather a digital projection. I was struck by the clarity and stability of the image, but was totally put off by the sharp, lifeless, pixellated (albeit small) appearance.  It was like watching a big TV, which is essentially what it is, I suppose.
I understand the flexibility and the efficiency of the system, and I also understand why so many people are fans of it, from a purely technical point of view at least.  But I hate to see the artistically pleasing be sacrificed on the altar of digital efficiency.
Or, in other words, may I quote Jean Pierre Sarti (a character in the movie played by Yves Montand - E), from “Grand Prix”:
"The difference, my Dear, is in the art of it.  We could wade out, and hit the fish over the head, but there would be no art in that, would there?"
Hope you don't mind an unsolicited opinion!

I don't mind your opinion at all. In fact, as I discover when I write what turn out to be lengthy tomes, it's inspiring to answer questions or address comments for different people. I wouldn't have composed this for anyone else quite the same way, though I must admit that after this got to a certain critical mass, I started to write it more generically for consumption by others. But it's still tailored for you.

I'm writing this sentence after having composed quite a lot which follows. And I realize upon re-reading your original mail (as I'm trying to organize many paragraphs into something less rambling) that you've really touched on a lot of ideas I've discovered I wanted to express to you regarding this topic. So I'll now try to use your email to create my topical subheadings, and see if that helps me organize . . .

ALREADY?

We've known that digital projection was coming for several years. Some time in 2009 or so, we saw an article declaring that most theaters would convert to digital projection by 2014. I think it was upon our return from our 2011 summer trip that we were standing in line at our "home" AMC multiplex in Burbank, and I was looking at the marquee (which is now made up of dozens of large HDTVs), when I finally noticed that nearly all the titles were followed by "-D." The only exceptions were the IMAX and 3-D presentations, both of which I knew were digitally projected. I asked the ticket clerk, "What does the 'D' at the end of each title mean?," and they responded, "It's digital projection." I said, "Are you projecting anything on film?" They shook their head and said, "No." Even though we knew it was eventually coming, it was startling.

"I ABSOLUTELY HATE DIGITALLY PROJECTED MOVIES"

Until video came along, motion picture film was the way we recorded and displayed the moving visual world. When television debuted, it was incapable of producing the resolution or frame size of film, so we didn't have to compare them in theatrical venues until somewhat recently. So for a century, we've had one collective idea about what "movies" look like.

I'm fascinated that the rudimentary memes of an automobile - four wheels arranged on two axles, a steering wheel, and foot-operated pedals for throttle and braking - were codified within a decade or so, and continue to serve 100 years later in applications as varied as a Formula 1 car and a forklift.

And though there have been many improvements to film, the properties of the film projection experience have remained essentially the same for most of that century. When we watch the earliest silent films, they don't seem that alien, because the infrastructure is basically the same. Is this worldwide system of projecting 35mm film at 24 frames per second (shuttered twice or thrice per frame - more about that later) the best solution for presenting moving images? I don't think we can say that is so. It's been adequate, and served us all well. Did it need to be changed? Maybe not.

"THE DIFFERENCE, MY DEAR, IS THE ART OF IT"

These rudimentary aspects of film have become more than characteristics. Over time, because artists, craftsmen and engineers have sought to improve and understand the nature of the medium, the medium itself has become part of the art. Like the texture of the canvas stretched over a frame under a fine painting, the artifacts of film - its graininess, the blurred renditions of moving subjects, the limitations of the film to represent the lightest and darkest parts of the scene - all become part of the cultural experience that is motion picture film viewing.

I'd say it would be inappropriate to declare whether losing some of these characteristics is "better" or "worse." It's a change. If you treasure the way film has looked, you'll be disappointed as it is replaced with image sensors and projectors that no longer attempt to replicate the "look and feel" of film. If you think images should look more like reality, maybe you'll be pleased by the change. People stopped making daguerreotypes as a method of general-purpose photography in favor of newer technologies. But the images those newer imaging technologies produced weren't the same, nor were they themselves perfect renditions of the photographed scene.

More recently, we in the U.S. transitioned from six-decade-old NTSC television to various new formats within the ATSC standard. There are no doubt some people who believe that there were qualities of NTSC (qualities which I would define as compromises or otherwise unwanted artifacts) which were a part of watching television. I will admit that I can feel nostalgic about analog static, ghostly artifacts of picture signals bouncing off an overhead aircraft, and rolling pictures on aging televisions, but I don't really want that as part of my standard. I'm not making an exact parallel to film, as some of the unique qualities of film aren't so obviously negative, but I hope you get the idea.

So I think we're talking about something of that nature - a transition. It's a change imposed primarily by commerce. Its effects will be recognized by a small percentage of the consumer audience, and a large percentage of the cinematographers who spend their lives thinking about these subtleties.

So the characteristics of what we know as motion-picture film are an "acquired taste." Those people of our age who have a cultivated perception and can recognize the difference will find even small differences in frame rate and the associated frame blur quite noticeable. For me, even profound. From the first day I met my wife in January of 1981, we started a discussion of video vs. film. She had chosen to aggressively champion video, and my interest had been in film. Because I've been observant about the topic for 31 years, I've gone through a lot of introspection about why I perceive them differently.

WHAT'S DIFFERENT WITH DIGITAL PROJECTION?

With today's "digital cinema" projectors, here are characteristics that I think are noticeable:

  • No frame float - When typical projectors are running, the film's intermittent motion allows for some inaccuracy in positioning (as does the film's transport through the camera during photography). As a result, the projected image constantly wanders about. With large screens, sloppy projectors and worn prints, this can amount to several inches of float - easily noticed at the edges of the frame. Digital projection has NO float. This isn't such a bad thing, but like many other topics here, it's been a part of our film-viewing experience for a century - so it's a big change.
  • Bright - Most of the digital cinema projectors are set up to send more light to the screen than you usually get from the typical film projector chosen for the same venue. This is generally appealing, and hides the fact that the black levels may in fact be somewhat higher (see the next item). This additional available brightness helps somewhat in overcoming the dramatic light loss from 3-D presentations, where both the projector and the audience employ light-attenuating polarizers.
  • Elevated black levels - Just before the movie starts, most digital projectors reveal a pretty brightly-glowing raster, much brighter than the amount of light transmitted through a piece of black film. Because their maximum brightness tends to be higher, this isn't noticeable when the content is playing.
Not so noticeable at this point are differences in contrast ratios between film and digital projection. The latest projectors are claiming contrast ratios of over 2,000:1. If this is real, then it exceeds the roughly 150-200:1 typically claimed for 35mm film prints.

Nor is resolution an issue. While most installed digital cinema projectors are so-called "2K" (2048 x 1080 pixels, not significantly different from the 1920 x 1080 maximum of HDTV), and 35mm film can certainly record more data, 2K projection can look pretty good. Note that 4K projectors (and higher) are available, although many theater chains that have already made the substantial commitment to purchasing very expensive 2K projectors for their entire chain may delay upgrading.

FRAME RATE/FRAME BLUR

Some of what we associate with watching film projection is an unnatural experience. Unnatural because in the real world, things that are moving - which includes our view of the world when we move our heads - aren't presented to us as sequential still images. So this illusion is a combination of what we call "persistence of vision" and a trained characteristic: our brains attempt to make sense of these sequential images as some representation of what we see in the Real World. In other words, we're used to how movies look. Are they a perfect representation of the experience of looking at the real world with our eyes? No. Not even close. At 24 frames per second, and using the default standard of a 180-degree shutter (a semicircular spinning affair which covers the aperture while the film is being yanked to the next exposing position), the motion-picture camera is thus merely a 1/48th of a second still camera. And as anyone knows from shooting still photography, this exposure duration is incapable of "freezing action." Even normal human activity such as walking produces significant frame blur in a 1/48-second exposure. So our entire history of watching motion pictures also contains this distortion of reality - in some circumstances, massive frame blur. If the camera pans at more than 20-30 degrees per second while following action on a semi-long lens, the background smears into streaks. In real life, our brains tend to ignore what's going on behind someone we're following in the same way, and our eyes can and do "smear" during panning, but not as much as 24fps film.
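
For the arithmetic-minded, here's that relationship in a few lines of Python - just a sketch of the math described above, nothing authoritative:

    # Exposure per frame: the fraction of the frame interval during which
    # the rotating shutter is open. A 180-degree shutter is open half the time.
    def exposure_time(fps=24, shutter_angle=180):
        return (shutter_angle / 360.0) / fps

    print(exposure_time())           # 1/48 s (~0.0208) - the classic film look
    print(exposure_time(24, 90))     # 1/96 s - crisper, more stroboscopic motion
    print(exposure_time(48, 180))    # 1/96 s - same exposure, via doubled frame rate

Narrowing the shutter angle or raising the frame rate both shorten the exposure, which is why both produce the "crisper" motion discussed later on.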

The frame rate of 24 frames per second was standardized long ago, and chosen out of practical and economic need. Early motion pictures had settled into the mid-teens for frame rate (early cameras and projectors being hand-cranked) as a minimum tolerable level of flicker experienced by the average viewer. When sound-on-film demanded standardization of film speeds, 24fps became a worldwide standard.

It has been common practice for motion-picture film projectors to incorporate shutters with two or three blades which interrupt the projected image twice or thrice per frame. Thus, we are actually watching a 48Hz or 72Hz intermittent light source, in which each second's 24 discrete images are presented two or three times apiece. This strategy reduces perceived flicker because it shortens the alternating intervals during which your eye's electro-chemical cells are exposed to, and deprived of, light.

Visual-effects man Douglas Trumbull has championed higher frame-rate photography and projection with his Showscan enterprise for over three decades. He claims that in testing, humans perceive differences in the viewing experience from increased frame rate up to the high 100s of frames per second. His Showscan "system" was simply shooting and projecting 35mm and 65mm film at higher rates (typically 48fps and 60fps), and has been utilized primarily in permanent installations and in motion-ride simulations. We've seen a few Showscan rides, and I was privileged to see a remarkable Showscan demo at their offices in the late 1980s.

More recently, James Cameron ("Terminator," "Avatar") and Peter Jackson ("Lord of the Rings") have taken up the gauntlet of higher frame rates. Because existing digital projection systems are already running at 144 frames per second when presenting some 3-D content (the RealD system, which alternately presents each 24fps left- and right-eye image three times), there's existing infrastructure for projecting at higher frame rates. Cameron had hoped to shoot and distribute "Avatar" at 48fps, but was unable to convince the distribution business of this at the time. Peter Jackson has been shooting his new Hobbit movie at 48fps, and intends to distribute at the higher frame rate, so this may be a motion picture business landmark. Whether audiences *like* it remains to be seen. Cameron claims some viewers' complaints of "motion sickness" when watching stereoscopic "3-D" movies can be eliminated by increasing the frame rates, thereby reducing "edge artifacts" (by which I think he's referring to motion-blur).

When digital artists are creating synthetic scenes and action, it is their practice to procedurally add motion-blur artifacts to the footage when it is rendered, to simulate the artifacts of real-world photography and human vision. If they did not do this, the resulting footage would be unnaturally sharp and stroboscopic. Watch the most action-filled stop-motion animation scenes of a film like "The 7th Voyage of Sinbad" (animation by the legendary Ray Harryhausen) and you'll know what it looks like to have absolutely NO motion blur. For the tauntauns in "The Empire Strikes Back," dragons in "Dragonslayer" and the attacking harpies in "Young Sherlock Holmes," stop-motion animator Phil Tippett used a technique dubbed "Go motion," in which parts of puppets which were to be in motion were mechanically moved during the single-frame exposure to create actual motion blur. So much a part of our viewing experience is this artifact, that these techniques have been painstakingly developed to represent it.

All this having been said, frame rates thus far remain at the same 24 frames per second they have been for decades. But with the transition to video presentation, it is inevitable that frame rates will not only change, but perhaps vary from production to production.

You may have seen store HDTVs on display in modes demonstrating 120Hz, 240Hz and higher frame rates. These televisions are using custom digital image processors to synthesize intermediate frames between the existing frames of content which originated at 24, 30 and 60 fps. For us - and probably you - the effects are horrifying. The first time I saw this was from 60 feet across a Costco store, on a scene from a motion picture. My jaw dropped, and I walked over to the TVs. The footage looks ultra-crisp (using digital sharpening algorithms), and lacks frame blur. It looks like scenes shot with a strobe light, or at very small shutter openings (see below). I absolutely HATE it. But television manufacturers are trying to sell televisions. They don't care about the artistic intentions of the original filmmakers, and neither will many consumers.

(VARIABLE SHUTTER ANGLES: Film cameras can be equipped with shutters with variable openings, so the exposure time is no longer coupled to frame rate. Likewise, digital image sensors can now independently vary exposure time and frame rate. Short exposure times and the resultant lack of frame blur have been applied artistically by some directors and cinematographers to represent heightened awareness, or to simply look dramatic. I'm still not crazy about it after it's been in vogue for several years, but I've unconsciously added it to my visual vocabulary. See the action scenes from director Ridley Scott's "Black Hawk Down" and "Gladiator" for examples.)

Some day, they'll probably shoot and broadcast sporting events at frame rates of hundreds of frames per second. I can see that. You'll finally be able to follow the hockey puck. But I'm really not interested in seeing "Lawrence of Arabia" at 360 frames per second.

CULTURAL ASSOCIATIONS WITH FRAME RATE

In the U.S., because of our 30fps television standard and our 24fps motion-picture film standard, we've learned to associate each frame rate's visual artifacts with its medium.

So attuned are we to the subtle difference between 24 and 30 fps that when, on infrequent occasions, a television program shot on film at 24 frames per second is sped up so that one film frame appears in every 1/30-second video frame, we respond with the comment, "it looks like video."

Within the television industry, it has long been accepted that 24fps footage has the "class" of film production. Shows traditionally shot on 30fps video included sitcoms and soap operas, while "prime time" narrative dramas were produced on film at 24fps. Even today, as the bulk of all programming is acquired electronically, those cameras are typically configured to shoot at 24fps because of this association with quality.

Despite the fact that our current television standards are still based on 30 and 60 Hz models, new video cameras designed for "digital cinema" and even consumer camcorders offer modes to shoot at 24fps - so strong is our cultural association with this frame rate.

Another slightly bizarre example of our being culturally adapted to this technological artifact is "2:3 pulldown" (or "3:2 pulldown"). This describes one of the semi-complex cadences used to display 24 frame per second films on NTSC video, in which each 1/30 second frame is composed of two 1/60 second fields. By displaying each group of 4 film frames (A, B, C, D in this example) across multiple subsequent fields of video: AA/BB/BC/CD/DD/AA/BB/BC/CD/DD, the film plays in the original amount of time. Thanks to the projector's film loop, which isolates the continuously-moving soundtrack reader from the intermittent stop-start action of the film at the picture gate, sound plays normally as it always did. When I was a projectionist at WFMY-TV, I was unaware that this was happening mechanically in those RCA telecines, and that the distinct chatter of those machines was a result of the 2-3 cadence. Every time you've watched a theatrical motion picture on NTSC television, you've been watching this cadence. If you look at the footage an interlaced frame at a time (both fields displayed at once), you'll see that those BC and CD frames are half of the horizontal scan lines from one film frame interlaced with the next.
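
Here's a toy Python sketch of that cadence - letters standing in for film frames, pairs standing in for interlaced video frames (real telecines juggle fields of picture, of course, not letters):

    # 2:3 pulldown: each 24fps film frame contributes alternately 2 then 3
    # video fields; consecutive field pairs form the 30fps interlaced frames.
    def pulldown_23(film_frames):
        fields = []
        for i, f in enumerate(film_frames):
            fields.extend([f] * (2 if i % 2 == 0 else 3))
        return [tuple(fields[i:i + 2]) for i in range(0, len(fields), 2)]

    print(pulldown_23("ABCD"))
    # [('A', 'A'), ('B', 'B'), ('B', 'C'), ('C', 'D'), ('D', 'D')]

Four film frames become ten fields - five video frames - so 24 film frames fill exactly 60 fields (30 frames) of video each second, and the running time is unchanged.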

(People in 25fps/50Hz television countries simply watched movies 4% faster and with 4% higher-pitched audio for most of TV history. Today, both 24fps/30fps/60Hz and 24fps/25fps/50Hz countries benefit from far more complex digitally-produced pulldown cadences.)

FILM GRAIN

Another characteristic we associate with film is grain. In some sense, the visible artifacts of those randomly distributed silver particles could be considered "noise" - an undesirable loss of fidelity. But just as the effect of one language on the speaking of a second, over time, stops being called an "accent" and starts being called a "dialect," I'd argue that film grain has become part of the medium.

So apparent is film grain that when visual effects artists are integrating synthetic imagery or footage shot on digital cameras with principal photography shot on 35mm film, they add "film grain" to the new elements to match. We've seen many demonstrations of increasingly improved electronic cameras developed specifically for motion-picture production, and while they record "filmic" representations of light and are often "tuned" to simulate film/light response curves, I recognized early on that the lack of grain (sometimes replaced by sensor noise or digital compression artifacts) was - for me and my wife - disconcerting. I'm not calling it "wrong," but it's definitely not what we've had. Does it matter that it's different from what we've had for 150 years? Well, it matters to me, and no doubt to all the cinematographers and photographers who have spent their lives intimately understanding and anticipating the nature of film. In the past decade's deconstructionist trend in filmmaking, many techniques have been developed which deliberately create more noticeable film grain via abnormal film lab procedures. My friend Picha, a cinematographer, likes to sit in the front rows of a movie theater because he likes the immersion of filling his periphery, and he likes to see the film's grain.

Do electronic imaging devices make pleasing images, without grain? Sure. These days, if you see a photo in a magazine or even a shot in a movie, there's a good chance that no photochemistry was involved.

Interestingly, the two movies you mentioned seeing: "Gnomeo & Juliet" and "Rio," were both likely produced without the benefit of chemical film at any stage. In the case of the former, you saw a film print made from the digitally-rendered production, written to film via a "film recorder." In the latter case, you may have seen a production in which no light was ever involved until it emerged from the digital projector in your theater. So in the first case, some of these "legacy" characteristics I've mentioned: film grain, frame float, film contrast ratio - were imposed on the grainless, non-floating digital file - making it a more familiar experience. In the latter, the New World.

GRAIN REDUCTION VIA FRAME RATE

A consequence of increasing frame rate is a reduction of perceived noise, be it film grain, electronic noise or digital compression artifacts. Even the 25 per cent difference between 24fps and 30fps produces noticeably reduced apparent noise, as it is averaged out over more, randomly differing samples.

If moving away from film to all-electronic production means the loss of grain, an eventual increase in frame rates will produce an even more noiseless image.
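
If you want to convince yourself of the averaging effect, here's a tiny numpy sketch. The tenth-of-a-second "integration window" is purely my stand-in for however long the eye blends successive frames - an assumption for illustration, not vision science:

    import numpy as np

    rng = np.random.default_rng(1)
    window = 0.1    # assumed: the eye blends grain over roughly 100 ms
    for fps in (24, 30, 60, 120):
        n = max(1, int(fps * window))     # frames blended within the window
        grain = rng.standard_normal((n, 100_000))
        print(f"{fps:4d} fps: residual noise = {grain.mean(axis=0).std():.3f}")

Because independent noise averages down roughly with the square root of the number of samples, doubling the frame rate cuts the residual grain by about 30 percent.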

IT'S NOT JUST PROJECTION

This electronic/digital sea-change is happening across the entire motion-picture production process:

  • Acquisition/generation - Increasingly, television programs and theatrical motion pictures are choosing to shoot on digital cameras rather than on 35mm film. Motives include:
    • Some potential cost savings - Though the savings over film might be somewhat small in the course of a typical $50M production.
    • Ease of workflow in VFX (visual effects)-heavy shows - Where scanning film is time-consuming and expensive
    • Physical compactness of camera equipment - Magazines of film add considerable bulk for confined spaces.
    • No magazine changes - Data from digital cameras can be sent electronically to banks of hard drives, making "reloads" a thing of the past.
    • Not much difference in appearance - For most viewers, the difference between shows shot on film and on 24fps cameras either designed or post-processed to look like film is a wash.
    • SAG/AFTRA - Three or four years ago, in a conversation with a friend about his work at the Screen Actors Guild (now merged into SAG-AFTRA), we discovered that a half-century-old agreement between the two acting guilds left the American Federation of Television and Radio Artists to sign performers on productions shot on "video," while SAG retained the more prestigious "film" projects. Flash forward to the late Naughties (2007-8 or so), when observant contract-readers realized that this meant any project shot on video could sign a performer who was a member of either union (SAG and AFTRA honored each other's signatories). As SAG had been in constant turmoil and presented a very unstable future, producers simply chose to shoot their television projects on digital cameras to avoid SAG entirely. So in one television season, the percentage of film vs. video productions switched from something like 80/20 to 20/80, and no doubt helped seal the fate of film production for TV, as cinematographers and directors came to realize that it wasn't such a big change after all. In predicting the forces which would contribute to the progression from film to electronic acquisition, I wouldn't have considered such influences before this.
  • Editorial - Even the last die-hards like Steven Spielberg, who insist on editing on film, will have to adapt as infrastructure for providing the pieces - like film work-prints from processing labs - vanishes due to lack of demand. I've had some hands-on sense of the progression from film, to tape, to digital nonlinear editing - and I'd certainly never wish to go back. Though it's been painful for some old-school film editors to make the transition (and I'm sure many just retired instead), those that stuck with it appreciate the improvement as well.
  • Finishing - It's still possible (if you're successful, profitable, and eccentric enough) to do a "photochemical finish," where the camera original negative is carefully pulled from a film vault at the end of editing, very cautiously cleaned and spliced, and then optically printed to intermediate "interpositive" stock with shot-by-shot exposure corrections made in this printing pass, from which a small number of "internegatives" are printed to ship to duplication facilities. The release-print positives are mass-produced from these internegs while the camera negative and exposure-corrected interpositive are safely secured back in a vault. But if you've seen a movie in the past several years (and apparently you've seen at least *one*), you've seen a movie that was most likely finished via "digital intermediate," or "D.I." In this modern process, camera negatives are scanned to high-resolution digital files, and potentially never handled again. Editorial and finishing are in the digital realm, until 35mm internegs are created on digital "film recorders," after which the existing network of duplication labs handles printing and shipping in the traditional way.
  • Distribution - You've seen the octagonal film shipping containers in which prints are shipped to theaters. Today, movies open on 4,000 screens on the same day - which is a significant amount of shipping cost. Each print costs a couple of thousand dollars, and wears out a little with each screening. Digital presentations can be distributed on hard drives, optical media, via satellite or Internet. A single "copy" can be played on multiple screens concurrently, and even remotely re-licensed for more screens should demand increase, so a theater could immediately add screens for a runaway hit - impossible with physically-shipped film prints. Further, these digital "prints" are encrypted to (theoretically) inhibit piracy.

WHAT DIFFERENCE DOES IT MAKE?

Maybe it doesn't. Film reigned for a ridiculously long period of time as *the* method for recording visual information. It evolved and got better. Engineers improved how it worked. Artists exploited and explored the possibilities of expression and creation with it. But like dinosaurs, the Romans and buggy whips, maybe its time has passed. I don't *want* that to be true, and film photography may be kept alive longer than expected - like vinyl records. Like Boeing B-52s. (These bombers were deployed 60 years ago with a working life expectancy of perhaps a decade or two, but a changing world and economic practicality have the current program extending the lives of these Cold War aircraft to the year 2040 - the friend to whom this was written knows this. - E) But making chemical film is somewhat complex, and benefits from scale of manufacturing. Yes, there will probably be boutiques making small batches of film, but they probably won't have the predictable characteristics of modern stocks.

Once upon a time not so long ago, Eastman Kodak made all their money selling film to consumers, who snapped billions of photos and printed them on Kodak paper. As what nearly amounted to a vanity product, they researched, developed and supplied film for the motion picture industry. Today, the shrinking motion picture stock business is a primary income source for the once-giant company, and in digital consumer cameras Kodak is merely another maker of lookalike shiny gizmos that now compete with telephones as the primary photographic device.

Those young people who are buying digital cameras, those young people that are in my wife's film-school classes right now, they're already growing up in a world where less and less film is shot. They like what they see, and when they find out the next Hobbit movie and the next giant franchise that they fall in love with is shot on a "Red Epic 5K digital cinema camera," that's what they'll hope to make their movies with. So most of the current generation of content-creators won't see new imaging technologies as "not film." They'll just consider them part of the palette of technologies and choices, until the last few directors make the last few valiant stabs at a film production.

Ironically (or maybe that's not the right term), unlike film stocks, digital imaging technology is a very fast-moving target. By the time those students get to direct a feature film, today's cutting edge cameras (which thus far are built around an imaging sensor) will be tomorrow's junk. If today's cinematographers could buy all the film stocks from the past 80 years, they would. Every day, someone tries to use chemical or computing processes that emulate the characteristics of film stocks of the past.

The point is, things change. Sometimes, they change when you wish they wouldn't. Economic forces and fashion are already pushing film-based production towards its demise.

WHAT'S THE TAKE-HOME?

Photographic intention can range from clinical, where the fidelity of the recording and its similarity to the original are paramount, to impressionistic, where the recording medium itself is deliberately exploited and even abused for effect.

Changing from chemically-based film recording and presentation to electronic is akin to the invention and evolution of musical synthesizers, except that at the point where synths could transparently replace real acoustic instruments and performers (as well as make sounds beyond the scope of the physical world), acoustic music-making wasn't put aside for all time. I think there's a good chance that film will cease to be produced in our lifetimes.

There will always be the motion pictures of the film past to watch. And as has occurred throughout history, artists will explore and revisit the past, simulating film and all its foibles. They may even make a batch of film, just to do it.

I’m going to miss film. You can hold it up to the light and see the image. When you’re in the dark, loading a film magazine and you’re not sure which way the film is emerging from the mag, you can touch it to your lips - if it’s the emulsion side, it sticks.

And it smells great.

Monday, October 29, 2012

Why Your Next Tablet Should Be an iPad with Cellular Data Support

In case you never knew:

What remains one of the most powerful advantages of the iPad is that Apple negotiated with cellular data providers so that your cellular data plan is month-to-month with no contract, and can be purchased directly on the iPad. So for example, you buy an iPad with “WiFi + Cellular” (the cellular variants will cost $130 more than the WiFi-only models, but read on to see why I recommend doing this), BUT you never activate the cellular service. At some point in the future, you can actually be rolling down the road (hopefully as a passenger, and not the vehicle operator) and realize that you MUST access an Internet service, and all you need is a credit card. You create the account right on the iPad, using the iPad’s cellular radio hardware that you and your iPad have ignored up to that point. Within minutes, you’ll be on the Internet.

At the end of the month (or if you manage to use up all the data in the plan you’ve purchased before the month ends), you can choose to let the service lapse, and you pay no more money. Or you can renew or add data as you wish. Plans start at $15/250MB, which is actually a very useful amount for Web browsing.

In my case, I’m a very frequent user, but because I’m primarily searching for text-based information, it doesn’t amount to much data. I average under 0.5GB/month, even though we have a grandfathered Unlimited Data plan. Now, if you choose to watch all your Netflix streaming movies on your iPhone/iPad (which you can), that’s different. On those devices, you’ll probably move 1 to 2 gigabytes of data during the 2-hour movie (on an HDTV streaming device, Netflix can automatically adjust up to about 2.3GB/hr).
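
To make the 250MB figure concrete, here's some back-of-the-envelope Python. The browsing rate is my own rough guess, and the streaming rate is just the midpoint of the 1-2GB-per-movie figure above - both assumptions, not measurements:

    PLAN_MB = 250               # the $15 entry-level plan
    BROWSE_MB_PER_HR = 10       # assumed: text-heavy Web browsing
    STREAM_MB_PER_HR = 750      # assumed: midpoint of 1-2GB per 2-hour movie

    print(PLAN_MB / BROWSE_MB_PER_HR, "hours of browsing")          # 25.0 hours
    print(PLAN_MB / STREAM_MB_PER_HR * 60, "minutes of streaming")  # 20.0 minutes

In other words, the smallest plan buys weeks of casual lookups, or about a third of one streamed movie.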

We’ve stayed at hotels and been in convention centers that charged up to $30(!) for a 24-hour subscription to their WiFi Internet service. For that much money, you can turn on your iPad’s cellular radio-based Internet connection (WiFi + Cellular model, with AT&T, Sprint or Verizon support) any time you like for one month.

If you’re a modest consumer of bandwidth, and NOT into streaming movies and music, you may be able to use your shared Mobile Hotspot on your iPad (or iPhone) as your Internet service for your home or office, connecting your desktop and laptop computers to your mobile device as you would your home’s WiFi access point. For people living in areas not served by DSL or cable broadband providers, this can be a very practical alternative to satellite-based Internet access.


FREE MOBILE HOTSPOT WITH VERIZON

I missed this important detail when it was announced way back in March 2012.

If you get an Apple iPad in one of the “WiFi + Cellular” variants, and select the model that works on the Verizon network, you can use your iPad’s (paid) Internet connection to share with up to five other WiFi devices (computers, tablets, game devices, DVRs, etc.) WITHOUT having to pay an additional fee for “Mobile Hotspot” service. On AT&T iPads (and AT&T and Verizon iPhones), you must pay an additional $20/month for the privilege of being able to activate the Mobile Hotspot (or as Apple calls it, “Personal Hotspot”) functionality.

(AT&T has suggested that it would eventually offer the free Mobile Hotspot feature with their iPad plan, but as of the end of October 2012 that has yet to surface.)

So if you’re in the market to buy an iPad, consider: 1) paying the extra $130 for the “WiFi + Cellular” model, even if you don’t think you’ll need cellular access right away, and 2) buying a Verizon model, unless you know for certain that the cellular service area in which you wish to use your iPad has poor or no Verizon service.

Sunday, September 09, 2012

Wearing Headsets in an Automobile

Ever wonder whether wearing a headset or earphones while operating a car is legal in your state or province? AAA has a page listing current laws:

http://drivinglaws.aaa.com/laws/headsets/

Wednesday, May 30, 2012

Extract Timestamp Information Inside DV Streams

I'm working on a huge project to archive hundreds of hours (accumulated over 30+ years) of personal videotape from many sources.

Our Digital8 camcorder embeds time and date metadata in the digital stream recorded on tape, which the camera can display. But I'm expecting never to play one of these tapes again (either because the tapes or VTRs will eventually fail), so the only thing I'm migrating forward from here on is digital files.

But how to read the embedded timestamp data within the "DV streams" I'm archiving to hard disk?

After much research and experimentation with various offerings, I've settled on the free "DV Analyzer" from AudioVisual Preservation Solutions.

This utility can process a video file or directory full of video files, and generates a text-based or XML report of errors it finds in the digital file. It also reports the timestamp information for every shot. It's not the friendliest of solutions, but it will serve my purposes well enough.

I'll batch-process my video files with DV Analyzer overnight (it takes several minutes to analyze a 60-minute file), and use the resulting reports while I'm hand-logging the contents of the video files. I'm only interested in noting the times for any given new "event" on the tape, so there's typically not a huge number of events for which to determine the date. Up to now that's been a "manual" process: I've been determining dates of the footage either by listening for myself saying the current time and date (which I've been doing for 30 years), or by looking for the visible time and date I would briefly superimpose over the video (which I've also done for many years, where the camera provided this feature).
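
For the curious, the timestamps DV Analyzer reports are sitting right in the stream as five-byte "packs" defined by the IEC 61834 family of DV standards - REC DATE (pack ID 0x62) and REC TIME (pack ID 0x63). Here's a rough Python sketch of how one might fish them out of a raw NTSC .dv file. This is my own heuristic illustration, not how DV Analyzer works internally; the frame size, the year pivot, and the raw (non-AVI/MOV-wrapped) input are all assumptions:

    import sys

    FRAME = 120000   # bytes per NTSC DV frame (PAL streams use 144000)
    DIF = 80         # a DV frame is a sequence of 80-byte DIF blocks

    def bcd(byte, mask=0xFF):
        b = byte & mask
        return (b >> 4) * 10 + (b & 0x0F)

    def rec_datetime(frame):
        date = time = None
        for off in range(0, len(frame), DIF):
            blk = frame[off:off + DIF]
            if (blk[0] >> 5) != 2:        # keep only VAUX blocks (SCT == 2)
                continue
            for p in range(3, 78, 5):     # 15 five-byte packs after 3-byte ID
                pk = blk[p:p + 5]
                if pk[0] == 0x62 and pk[2] != 0xFF:     # REC DATE pack
                    y = bcd(pk[4])        # assumed pivot: 00-69 means 2000s
                    date = (2000 + y if y < 70 else 1900 + y,
                            bcd(pk[3], 0x1F), bcd(pk[2], 0x3F))
                elif pk[0] == 0x63 and pk[4] != 0xFF:   # REC TIME pack
                    time = (bcd(pk[4], 0x3F), bcd(pk[3], 0x7F), bcd(pk[2], 0x7F))
        return date, time

    with open(sys.argv[1], "rb") as f:
        n, last = 0, None
        while len(frame := f.read(FRAME)) == FRAME:
            stamp = rec_datetime(frame)
            if all(stamp) and stamp != last:   # report clock changes (~1/sec)
                (yy, mo, dd), (hh, mm, ss) = stamp
                print(f"frame {n}: {yy:04d}-{mo:02d}-{dd:02d} "
                      f"{hh:02d}:{mm:02d}:{ss:02d}")
                last = stamp
            n += 1

For real archival work I'd still trust DV Analyzer's reports over a sketch like this, but it demystifies where those per-shot timestamps come from.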