People ask us this all the time. I just got off the phone with some smart, smart people who are asking all the right questions about metadata formats for exchanging media content info between services.
The wisest thing I've ever heard on this topic was from a veteran of umpteen standardization committees in the audience of a conference panel. Unfortunately I did not catch her name, but she said something along the lines of:
"Deciding on a metadata format is like picking a database. The hard part comes next, in defining the fields/data model".
Ideally we'd have a small number of well standardized formats, each with a range of wonderful libraries and tools. In the early days of MetaBroadcast we thought it might be possible to get there quickly. We've stopped believing that.
The truth is that it's really easy to convert between metadata formats. URIplay converts between dozens, and we're pretty agnostic. The format is an operational issue, just like choosing a database. The main criterion is the ease with which a fresh developer can understand what's required.
The really difficult bit is getting the fields right. Building a user-friendly product is going to require a set of compulsory fields. In the case of VoD or AoD products, compulsory fields will typically include titles, descriptions, pictures and a way to access the content. Unless you can deal with lots of gaps in your product, you have to specify this stuff, and content providers have to deliver. (Incidentally, this is one of the spots where RDF and the semantic web tend to fail. There's normally no agreement on the compulsory fields, which sidesteps the hard work of building an application from distributed data.)
The hardest fields to get right are those that allow the data consumer to identify an item of content. This is vital if you are going to figure out what's unchanged, old, or new as you update the content. Some standards do a poor job of identifying content. As a consumer of data you can't afford to get this wrong, and it can be really hard to explain to data producers.
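To make the change-detection problem concrete, here's a minimal sketch of what a data consumer has to do on every update, assuming each item carries a stable identifier (the field name "uri" here is illustrative, not a real schema):

```python
# Hypothetical sketch: splitting an incoming batch of items into new,
# changed and unchanged, keyed on each item's stable identifier.
def classify(stored, incoming):
    """Compare incoming items against a stored copy by stable ID."""
    stored_by_id = {item["uri"]: item for item in stored}
    new, changed, unchanged = [], [], []
    for item in incoming:
        previous = stored_by_id.get(item["uri"])
        if previous is None:
            new.append(item)
        elif previous != item:
            changed.append(item)
        else:
            unchanged.append(item)
    return new, changed, unchanged

stored = [{"uri": "a", "title": "Old"}, {"uri": "b", "title": "Same"}]
incoming = [{"uri": "a", "title": "New"},
            {"uri": "b", "title": "Same"},
            {"uri": "c", "title": "Brand new"}]
new, changed, unchanged = classify(stored, incoming)
```

The whole thing collapses if "uri" isn't stable between deliveries: every update then looks like a delete plus an insert, which is exactly the failure mode poor standards invite.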
So, back to the question. What is the best media metadata format? Well, there are three broad options:
- An "industry-strength" traditional standard – TV Anytime, MPEG 7, DAB EPG etc. Easy to understand they are not! But people expect profiles and rules to be applied. They look hard to handle, and they are hard to handle. That's OK if you're building something big and permanent, and telecoms/broadcast standards people will approve of your choice. But the world of the web thinks differently.
- Web standards – atom, media RSS, or more commonly a bastard cross-breed. People are familiar with these, so many will figure they're easy to handle. But they're not familiar with your set of compulsory feeds. So their standard feeds probably won't work, and their feed creation tools might struggle, too. Looks easy, probably has a sting in the tail. But at least you're building on something standard. You'll get some great generic tool support, but you're on your own for anything application specific, and you will probably have to define much more than if you started with #1 above. Still, your feeds will work in lots of generic readers too, and maybe your conventions are adopted some day?
- Roll your own – pick a simple base format that's easy to understand, like JSON. Reuse namespaces and field names where you can. You'll get basic tool support, ease of understanding, and clarity that this format requires special thought and development. Development effort for creator and consumer are probably similar to #2 above, but many people will accuse you of reinventing the wheel.
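To illustrate the roll-your-own option, here's a minimal sketch of a JSON item that enforces the compulsory fields discussed earlier before publication. The field names are hypothetical, for illustration only, and not a real MetaBroadcast or URIplay schema:

```python
import json

# Illustrative compulsory fields for a VoD/AoD item: identity, title,
# description, picture, and a way to access the content.
COMPULSORY = {"uri", "title", "description", "image", "locations"}

item = {
    "uri": "http://example.com/items/123",  # stable identifier
    "title": "Episode 1",
    "description": "A short synopsis.",
    "image": "http://example.com/images/123.jpg",
    "locations": [{"uri": "http://example.com/watch/123"}],
}

# Refuse to publish an item with gaps in the compulsory fields.
missing = COMPULSORY - set(item)
assert not missing, "item is missing compulsory fields: %s" % missing
print(json.dumps(item, indent=2))
```

The point is less the format than the validation step: whichever of the three options you pick, something has to reject feeds that skip the compulsory fields, or the gaps end up in your product.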
Maybe RDF and the semantic web will be added as a fourth option here, one day. These offer the possibility of really well described data that can be interchanged easily between many types of applications. But these bold aims require amazing tools and levels of standardization that have not yet been achieved, and still seem a long way off. We consume and produce RDF from our systems, but most of our effort still goes elsewhere.
At MetaBroadcast we follow all three of the approaches set out above. None is perfect, and each is right in some situations. Big players should probably support several options.
Which would you choose, and why?
This was cross-posted from the MetaBroadcast blog.
Posted at 14:31 BST, 16th April 2010.
We're excited to launch Come Dine With Me Homemade, a project on which we've been working away for the last few months with the marketing, food and cross-platform teams at Channel 4, and in collaboration with ITV Studios.
As far as we're aware, this is the first time that any U.K. broadcaster has used Facebook Connect in support of a television show. The site allows viewers to make their own Come Dine With Me episode, using their own photos, adding custom soundbites from the show's fantastic narrator Dave Lamb, plus theme tunes and graphics.
Facebook and Come Dine With Me are natural partners - nearly 200,000 people are fans of the show on Facebook. A user of Come Dine With Me Homemade is often only a single click away from logging into the site, viewing friends' parties, and sharing content back to Facebook.
MetaBroadcast built the front and back ends of Come Dine With Me Homemade. Once things get into full swing, we expect to be handling very large numbers of simultaneous users, especially after a peak-time airing of the TV show. Each user receives a personalised social experience on the site, so we have made heavy use of Purple, our own broadcast-strength social integration system.
Purple previously backed Test Tube Telly, a prototype social TV guide commissioned by 4iP, and launched in summer 2009.
Many thanks to the extended C4 team who trusted us with this delicate and highly satisfying project, special recognition going out to Louise Brown (@louby), Julia Pal and Jane Honey, Andrew Pipes (@The_Pied_Pipes), Sarah Rogers and Stephen Hardingham. We couldn't have gotten this far without the help of Lisa Campana, Jamie Knight (@JamieKnight), Igor Volk, and of course John Ayrez (@ayrez) and Robert Chatley (@belgiano).
This was cross-posted from the MetaBroadcast blog.
Posted at 22:53 GMT, 17th February 2010.
Over the last six months I have been experimenting with sous vide cooking at home. It's a wonderful way to cook meat perfectly, with minimal hassle - truly the future of food.
If you want the details about sous vide, check out this guide, read up on the scientific facts in one of Harold McGee's fine books (amazon us, uk), or get a chef's perspective in Heston Blumenthal's Big Fat Duck Cookbook (amazon us, uk) or Thomas Keller's Under Pressure: Cooking Sous Vide (amazon us, uk).
These tomes make fascinating reading, but many practical tips are missing. I'm planning to share my own experience in a series of posts here.
Before I get started, a word about safety. Sous vide often involves cooking bacteria-ridden food like raw meat at low temperatures. This can be very dangerous if it's done wrong. Before you start, make sure you understand the circumstances in which food pathogens are killed, and in which they multiply. Any of the references mentioned above contain the necessary information. I won't risk repeating it incorrectly here.
To start with, the basics:
Sous vide is wrongly named. It means 'under vacuum', but the vacuum is often incidental. The main thing is the low temperature at which food is cooked.
Top restaurants use thousands of pounds of expensive equipment for sous vide cooking, but great results can be achieved without any special tools.
You can cook sous vide by putting a piece of meat, sealed inside a zip-lock bag, in a pan of warm water, and monitoring the temperature with a digital probe to keep it reasonably constant. If the water is at 55°C you'll get medium-rare meat. It's that simple.
This is not so different from conventional cooking - we're always looking to get meat to 55°C for medium rare. But normally we try to do this in a pan or oven that is much hotter, so it's really difficult to get all the meat to a consistent temperature. Normally it's much hotter on the outside, and the middle keeps cooking after it comes out of the oven. Unless you use meat of the same shape every time, you'll be hard pressed to get the times and temperatures right with conventional methods.
Minimum cooking time is based mainly on thickness. A 2cm thick piece of meat needs at least 30 mins. Times go up fast as the meat gets thicker. Use a table to figure out the minimum time. There's one in this document.
There's no maximum cooking time. Heston Blumenthal says meat goes pappy if left for too long. But mere mortals can leave the meat in the water until they're ready to eat it. This makes life much easier, especially when cooking a complex meal, or for lots of people.
Brown meat tastes good. In most cases you'll want to brown the meat very quickly before or after sous vide cooking. A blow torch on meat brushed with oil is a fun way to brown lean meat like steak. But a hot pan is fine too.
A vac-pack machine is helpful to keep everything together. I use a "Seal a Meal" unit. The main benefit is that all the juices stay neatly inside the bag.
Brining and sous vide are natural partners. Mix a liter of water with 5 teaspoons of salt, then soak your meat in it overnight, before cooking. Try adding pepper, herbs and other spices to the water.
I'll try to post more tips and some extra details soon.
Have you tried sous vide? I'd love to hear other tips and tricks here, or on Twitter.
Update: This primer also looks rather interesting, especially the handy visual charts.
Posted at 21:43 GMT, 13th February 2010.