
The Kinetic Museum

Hey, so apparently I have a blog! Who knew? At any rate, it looks like I’m going to be speaking at this year’s MuseumNext conference (travel budget permitting) in Barcelona, where I’ll be joining Nancy Proctor, Nate Solas, Robin Dowden, Hein Wils, Ferry Piekart, and lots of other museum smartsies for several days of kicking presentations and conversations. I’ll try and fill this out in greater detail later, but for now, here’s what I’m planning on talking about…
———-
How much of museums’ overall effort is bound up in potential? How much time do museums waste defining “best practices” instead of simply moving ahead with a solution that just works? Because the museum as it exists today is still essentially built on the 19th-century model, changes in practice tend to evolve over years, if not decades. In a culture that now evolves at web-speed, the pace of museums’ own evolution is fundamentally unsustainable, if not suicidal.

Digital and technology practice in museums has, like a jet plane strapped to a hand cart, been artificially grafted onto this ancient model, with checkered results. Technology has been used by museums primarily as a tool of efficiency (produce label copy out of our CMS, stat!) or of strained relevancy (participatory culture! gamification!), rather than as a foundational concept. But what if this weren’t the case? What if a museum’s overall practice were built outwards from its technology efforts, rather than the other way around? What would a museum built from the ground up for speed and agility, rather than stability and longevity, look like? This presentation will speculate on this idea by examining the possible evolution of museum practice from a number of perspectives, including (but not limited to):

  • Scholarship and Content Development: What would the equivalent of GitHub look like for scholarship? How could museums leverage the work of hundreds of thousands of curators and scientists working together towards a common repository of knowledge, rather than duplicating efforts from museum to museum?
  • Variable Media Conservation: Artists are inventing, implementing, and discarding means of creating works of art orders of magnitude faster than conservation practice is evolving. How can the practice of conservation change to accommodate web-speed innovation?
  • Constituent Software Systems: Collections management, digital asset management, development, and other primary museum software systems are generally built on a cataloguing paradigm, with distribution, publication, and collaboration tacked on as “premium features,” when present at all. How would systems built for action and outcome, rather than simply cataloguing, change the practice of museums from the ground up?
  • Staffing: Digital media teams tend to be a tiny minority on an average museum’s staff, even though they are responsible for the vast majority of the museum’s interactions with the public. What would be the effect of inverting this model?

This presentation will pose many more questions than it will answer, but in so doing, will suggest new frameworks of understanding as attendees work towards building the museum of the future.

See you there, kids!

 


5 thoughts on “The Kinetic Museum”

  1. Devil’s advocate for wasting time on “best practices”: we just rebuilt the backend for the web component of our cellphone-based audio tour, and it took me about 2 seconds to find the TourML / TAP spec and realize my work was done. That’s how “best practices” / shared standards are supposed to work — even when it’s not an “official” spec, I know a lot of smart brains have done good work. Why should I make my own version of an audio tour data model?

    Or maybe that’s what you meant? My point, I guess, is that we *should* burn some time defining best practices, as long as we share them very publicly. And, ideally, *someone else* should burn that time…. (Thanks, IMA Labs! 😉

    The rest of it sounds stellar. Can’t wait.
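As an aside on the shared-spec point above: a minimal sketch of what a shared audio-tour data model in the spirit of TourML might look like, in Python. The class and field names here are illustrative assumptions for the sake of the example, not the actual TourML schema — the point is that once a community agrees on a structure like this, nobody has to reinvent it.

```python
from dataclasses import dataclass, field

# Hypothetical, simplified tour data model inspired by shared specs
# like TourML/TAP. Names are illustrative, not the real schema.

@dataclass
class Asset:
    uri: str         # e.g. path or URL to an audio file
    mime_type: str   # e.g. "audio/mpeg"

@dataclass
class Stop:
    stop_id: str
    title: str
    assets: list = field(default_factory=list)

@dataclass
class Tour:
    tour_id: str
    title: str
    stops: list = field(default_factory=list)

    def add_stop(self, stop: Stop) -> None:
        self.stops.append(stop)

# Build a one-stop tour the way an importer reading a shared spec might.
tour = Tour(tour_id="t1", title="Highlights Tour")
tour.add_stop(
    Stop(
        stop_id="s1",
        title="Gallery 3: Rothko",
        assets=[Asset(uri="audio/rothko.mp3", mime_type="audio/mpeg")],
    )
)
```

The design choice a shared spec buys you is exactly the one described above: any museum’s backend that emits or consumes this shape can interoperate with any other’s, so the “research project” reduces to reading the spec.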

  2. As someone who just spent the better part of a year trying to fix a predecessor’s trashed data (thanks contractors!), I totally respect the use of best practices when those practices are clearly defined. Most of the really stupid and frustrating things we run into as developers are caused by people ignoring the “best practices” of thirty years ago. And the rest of our frustrations are caused by people not being up to speed with “best practices” that were figured out within the last couple of years.

    I think what Koven is getting on about is those areas where “best practices” haven’t been defined yet; the new territory that we’re just now starting to explore. I think he means to promote the idea of putting working, publicly accessible prototypes for things that don’t have best practices yet out into the wild so we can talk about “best practices” from a practical rather than a purely theoretical standpoint.

    I like the idea of dangerous experimentation in moderation. It leads to what we should be doing rather than what we think we should be doing. Too often we get together and try to define best practices only to discover we were answering the wrong question.

  3. As usual, Nate finds the weakness in my argument 😉 But yes, Matt, your clarification is right on the nose. Where a best practice is a known quantity and it’s more efficient to embrace it rather than ignore it, I’m all for it.

    So maybe “best practices” is a bit of a misnomer–I was thinking more in terms of the research project we’ve all had to do at one time or another: “Go out and see how twenty other museums do this thing that you’re proposing before you get started, even though recovering from a bad decision will take less time than the research itself takes.”

    Don’t think I won’t be stealing “dangerous experimentation in moderation” for the presentation, either. 😉

  4. There are some examples of best practices that are usually stuck (meaning not necessarily findable via Google) in grant reports and white papers from places like NEH, IMLS, and NSF. When applying for any of those grants, you also need to survey what is already out there, and the grant committees can usually tell whether folks have really done a good scan of what exists, either as models or as cautionary examples. Those “environmental scans,” if you will, should also be made available from grant applications.

    NEH and IMLS are getting better about publishing this material, and it is definitely a good place to start when thinking about big issues.

    There are also folks in the digital humanities sector researching some of these issues as well, especially with regard to preservation and access–libraries and archives are really ahead of museums on this, as I’m sure you know. Preserving Virtual Worlds might have good advice on preserving digital art forms (was a big grant from Library of Congress and a few universities): http://pvw.illinois.edu/pvw/

    Enough of my ramblings for now. You’re asking good questions!
