Beyond “warm impulses”

I’ve been catching up on the Museopunks podcast series, and a section of March’s installment, The Economics of Free, particularly caught my attention. In an interview, the director of the Dallas Museum of Art, Maxwell L. Anderson, compares the data that shopping malls collect about their customers with the relative paucity of data collected about visitors to the typical art museum. I think it’s worth repeating (from about 18 minutes into the podcast):

[Malls] know all this basic information about their visitors. Then you go to an art museum. What do we know? How many warm impulses cross a threshold? That’s what we count! And then we’re done! And we have no idea what people are doing, once they come inside, what they’re experiencing, what they’re learning, what they’re leaving with, who they are, where they live, what interests and motivates them . . . so apart from that we’re doing great, you know. We’re like that mall that has no idea of sales per square foot, sales per customer. . . so we’re really not doing anything in respect to knowing our visitors. And learning about our visitors seems to me the most basic thing we can do after hanging the art. You know, you hang the art, and then you open the doors and all we have been doing is “hey look there are more people in the doors”.  And the Art Newspaper dedicates an annual ‘statistical porn’ edition of how many bodies crossed thresholds. Nobody’s asking how important the shows were, or what scholarly advances were realised as a function of them, or what people learned, how they affected grades in school. Nobody knows any of that. Nobody knows who the visitors were. So I consider it a baseline. We’re just at the primordial ooze of starting to understand what museums should be doing with this other part of our mission which is not the collection but the public.

I’d argue that we’re a little bit beyond the ‘primordial ooze’ stage of understanding*, although Anderson’s right that many museums don’t go much beyond counting ‘warm impulses’ (those infra-red people counters). He goes on to describe how the DMA’s Friends program is giving the museum more data about what their visitors do while inside the museum, and how this can inform their engagement strategies (22:45):

This is just another form of research, you know . . . we do research on our collections without blinking an eye, we think nothing of it. We spend copious amounts of time sending curators overseas to look at archives to study works of art but we’ve never studied our visitors. The only time museums typically study their visitors is when they have a big show, and they’re outperforming their last three years, everybody’s excited, and there’s a fever, and you measure that moment, which is measuring a fever. The fever subsides, the data’s no longer relevant but that’s what you hold on to and point to as economic impact. And largely, it’s an illusion.

I find it interesting that Anderson puts visitor research on a par with collection-based research. Often, I get the sense that collection research is seen as ‘core’ museological business, while visitor research is only a ‘nice to have’ if the budget allows. But perhaps this is a sign of shifting priorities?


*Historically, most visitor experience research has taken place in science centres, children’s museums, zoos and aquariums rather than museums of fine art. Although there are of course exceptions.

Young Adults and Museums

It’s always exciting when your research data throws up something counter-intuitive. Or at least something that’s at odds with “conventional wisdom” on the subject.

One such piece of wisdom about museum visitors is that young adults (particularly those aged under 25) tend not to visit museums. Population-level statistical data tends to back this up, with a characteristic dip in the 18-24 age bracket (see this graphic from a previous post):

Heritage visitation in Australia by age: percentage of respondents who visited a heritage site in the previous 12 months (Source: ABS, Table 1.4)

Now, here is the age breakdown of the respondents to my visitor survey conducted at the SA Museum as part of my PhD research:

Age breakdown of survey respondents (SA Museum visitor survey)

Not only are visitors aged under 30 not under-represented, they form the biggest age group I surveyed by a considerable margin! This is a surprising (albeit incidental) finding from my research which makes me wonder what’s going on here. Based on what I observed at the Museum during my fieldwork I have come up with the following hypotheses:

  • Proximity to university campuses. The SA Museum is right next door to Adelaide University and not very far from one of the main campuses of the University of South Australia. I got into conversation with a couple of groups of young adults who indicated they were visiting the museum to kill time between lectures.
  • The backpacker factor: The SA Museum is a popular destination with both interstate and international visitors (more than half of my sample indicated they were visiting the Museum for the first time, and I would wager that the majority of these were tourists). Based on my fieldwork observations, there appeared to be considerable numbers of young “backpacker” tourists in the survey sample. Anecdotally, younger international tourists also seemed less likely than older tourists to face the language barriers that would have prevented them from participating in the study (about 7% of the visitors I approached to complete a survey had limited or no English).
  • Free and centrally located: a few people indicated they were in the museum because it was free to enter and a way of escaping the heat or rain. There were a couple of people who were waiting for someone with a hospital appointment (the Royal Adelaide Hospital is just down the road). Of course, they could also have spent this time in the shopping malls just across the road – but for some reason chose not to. So there are clearly other characteristics of the museum that attract them, but these were beyond the scope of this survey. Others appear to have been ‘doing’ the precinct, visiting the Art Gallery of South Australia (next door) as well as the museum.
  • Young parents: A fair proportion of those in the 18-29 age group were accompanying young(ish) children. I don’t know if it’s just me, but I sense there has been a demographic shift between Generations X and Y. Most people of my (Gen X) vintage seemed to be well into their thirties before they settled down and started families. I suspect Gen Ys are having children younger, for a whole range of complex reasons which are beyond the scope of this post. This is just a gut feeling though – I haven’t cracked open the data.
  • Young couples: There was a surprising proportion of young (and highly demonstrative!) couples around. The museum as a date venue?
  • Patterns in the smoke: There is of course the possibility that this cluster is just a random quirk of my particular data set. However, the surveys were conducted across weekdays, weekends and public holidays (but not school holidays) to help control for variation in visiting patterns. My fieldwork observations show nothing to indicate that 18-29 year olds were more likely to agree to complete a survey than other age groups.

In retrospect, it would have been good if I’d been able to distinguish between the under-25s and over-25s by splitting the age ranges the way the ABS do (I had a reason for not doing so, but in any case it’s no big deal). However, I went back to a pilot sample from late last year and found the age spread using different categories was broadly similar:

Age breakdown of respondents in the pilot sample

So what does all this mean? I’m not sure yet. Age is not expected to be a significant variable in my own research, and I only collected very basic demographic information so that I had a general sense of the survey population. I’d be interested in how this tallies with other museums though, particularly those with free rather than ticketed entry. Ticketed venues tend to collect more comprehensive visitor data, and we tend to extrapolate from that. But perhaps they are not fully representative of museums as a whole?

Survey Responses – Benchmarks and Tips

I’ve now collected a grand total of 444 questionnaires for my PhD research (not including pilot samples) – which is not far off my target sample of 450-500. Just a few more to go! Based on my experiences,  I thought I’d share some of the lessons I’ve learned along the way . . .

Paper or Tablet?

My survey was a self-complete questionnaire (as opposed to an interviewer-led survey) that visitors filled out while on the exhibition floor. During piloting I tried both a paper survey and an electronic version on an iPad, but ended up opting for the paper version, as I think the pros outweighed the cons for my purposes.

The big upside of tablet-based surveys is that there is no need for manual data entry as a separate step – survey programs like Qualtrics can export directly into an SPSS file for analysis. And yes, manually entering data from paper surveys into a statistics program is time-consuming, tedious and a potential source of error. The other advantage of a tablet-based survey (or any electronic survey, for that matter) is that you can set up rules that prompt people to answer questions they may have inadvertently skipped, automatically randomise the order of questions to control for ordering effects, and so on. So why did I go the other way?

First of all, time is a trade-off: with paper surveys, I could recruit multiple people to complete the survey simultaneously – all I needed was a few more clipboards and pencils and plenty of comfortable seating nearby – whereas I only had one tablet, which meant only one person could be completing my survey at a time. Once you take into account being able to collect far more paper surveys in a given period, I think I’m still ahead even with the manual data entry. Plus, I found that entering the data manually is a useful first pass of analysis, particularly during the piloting stages when you’re looking for flaws in the survey design.

Secondly, I think many visitors were more comfortable using the old-fashioned paper surveys. They could see at a glance how long the survey was and how much further they had to go, whereas this was less transparent on the iPad (even though I had a progress bar).

This doesn’t mean I would never use a tablet – I think they’d be particularly useful for interviewer-led surveys, where you can only survey one participant at a time anyway, or for large-scale surveys with multiple interviewers and tablets in use.

Refining the recruitment “spiel”

People are understandably wary of enthusiastic-looking clipboard-bearers – after all, they’re usually trying to sell you something or sign you up to something. In my early piloting I think my initial approach may have come across as too “sales-y”, so I refined it so that the first thing I said was that I was a student. My gut feeling is that this immediately made people less defensive and more willing to listen to the rest of my “spiel” explaining the study and inviting them to participate. Saying I was a student doing research made it clear up front that I was interested in what they had to say, not in sales or spamming.

Response, Refusal and Attrition Rates

Like any good researcher should, I kept a fieldwork journal while I was out doing my surveys. In it I documented everyone I approached, approximately what time I did so, whether they took a participant information sheet or refused, and, if they refused, what reason (if any) they gave for doing so. During busy periods recording all this got a bit chaotic, so some pages of notes are more intelligible than others, but over time I evolved a shorthand for noting the most important things. The journal was also a place to document general facts about the day (what the weather was like, whether there was a cruise ship in town that day, times when large numbers of school groups dominated the exhibition floor, etc.). Using this journal, I’ve been able to look at what I call my response, refusal and attrition rates.

  • Response rate: the proportion of visitors (%) I approached who eventually returned a survey
  • Refusal rate: the proportion of visitors (%) who refused my invitation to participate when approached
  • Attrition rate: this one is a little specific to my particular survey method and wouldn’t always be relevant. I wanted people to complete the survey after they had finished looking around the exhibition, but for practical reasons could not do a traditional “exit survey” method (since there’s only one of me, I couldn’t simultaneously cover all the exhibition exits). So, as an alternative, I approached visitors on the exhibition floor, invited them to participate and gave them a participant information sheet if they accepted my invitation. As part of the briefing I asked them to return to a designated point once they had finished looking around the exhibition, at which point I gave them the questionnaire to fill out [1]. Not everyone who initially accepted a participant information sheet came back to complete the survey. These people I class as the attrition rate.

So my results were as follows: I approached a total of 912 visitors, of whom 339 refused to participate, giving a refusal rate of 37.2%. This leaves 573 who accepted a participant information sheet. Of these, 444 (77%) came back and completed a questionnaire, giving me an overall response rate of 48.7% (444/912). The attrition rate as a percentage of those who initially agreed to participate is therefore 23% – or, if you’d rather, 14% of the 912 people initially approached.
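If you want to reproduce these figures from your own tallies, the arithmetic is simple enough to script. Here’s a minimal sketch in plain Python – the function name and structure are my own, not tied to any survey package:

```python
# A quick sketch of the rate calculations described above.
def survey_rates(approached, refused, returned):
    """Return refusal, response and attrition rates (%) from journal tallies."""
    accepted = approached - refused  # visitors who took a participant information sheet
    dropped = accepted - returned    # accepted a sheet but never came back
    return {
        "refusal_rate": round(100 * refused / approached, 1),
        "response_rate": round(100 * returned / approached, 1),
        "attrition_vs_accepted": round(100 * dropped / accepted, 1),
        "attrition_vs_approached": round(100 * dropped / approached, 1),
    }

print(survey_rates(approached=912, refused=339, returned=444))
# {'refusal_rate': 37.2, 'response_rate': 48.7,
#  'attrition_vs_accepted': 22.5, 'attrition_vs_approached': 14.1}
```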

So is this good, bad or otherwise? Based on some data helpfully provided by Carolyn Meehan at Museum Victoria, I can say it’s probably at least average. Their average refusal rate is a bit under 50% – although it varies by type of survey, venue (Museum Victoria has three sites) and interviewer (some interviewers have a higher success rate than others).

Reasons for Refusal

While not everyone gave a reason for not being willing to participate (and they were under no obligation to do so), many did, and often apologetically so. Across my sample as a whole, reasons for refusal were as follows:

  • Not enough time: 24%
  • Poor / no English: 19%
  • Child related: 17%
  • Others / No reason given: 39%

Again, these refusal reasons are broadly comparable to those experienced by Museum Victoria, with the possible exception that my refusals included a considerably higher proportion of non-English speakers. It would appear that the South Australian Museum attracts a lot of international tourists or other non-English speakers, at least during the period I was doing surveys.

Improving the Response Rate

As noted above, subtly adjusting the way you approach and invite visitors to participate can have an impact on response rates. But there are some other approaches as well:

  • Keep the kids occupied: while parents with hyperactive toddlers are unlikely to participate under any circumstances, those with slightly older children can be encouraged if you can offer something to keep the kids occupied for 10 minutes or so. I had some storybooks and some crayons and paper, which worked well – in some cases the children were still happily drawing after their parents had completed the survey and had to be dragged away!
  • Offer a large print version: it appears that plenty of people leave their reading glasses at home (or in the bag they’ve checked into the cloakroom). Offering a large print version gives these people the option to participate if they wish. Interestingly, however, some people claimed they couldn’t read even the large print version without their glasses. I wonder how they can see anything at all sans spectacles if this is the case . . . then again, perhaps this is a socially acceptable alibi used by people with poor literacy levels?
  • Comfortable seating: an obvious one. Offer somewhere comfortable to sit down and complete the questionnaire. I think some visitors appreciated the excuse to sit down and take a break! Depending on your venue, you could also lay out some sweets or glasses of water.
  • Participant incentives: because I was doing questionnaires on the exhibition floor, putting out food or drink was not an option for me. But I did give everyone who returned a survey a voucher for a free hot drink at the Museum cafe. While I don’t think many (or any) did the survey just for the free coffee, it does send a signal that you value and appreciate your participants’ time.

[1] A potential issue with this approach is cuing bias – people may conceivably behave differently if they know they are going to fill out a questionnaire afterwards. I tried to mitigate this with my briefing, in which I asked visitors to “please continue to look around this exhibition as much or as little as you were going to anyway”, so that visitors did not feel pressure to visit the exhibition more diligently than they may have otherwise. Also, visitors did not actually see the questionnaire before they finished visiting the exhibition – if they asked what it was about, I said it was asking them “how you’d describe this exhibition environment and your experience in it”. In some cases I reassured visitors that it was definitely “not a quiz!”. This is not a perfect approach of course, and I can’t completely dismiss cuing bias as a factor, but any cuing bias would be a constant between exhibition spaces as I used comparable methods in each.

(More) Museum and Gallery Visits in England

Back in late 2011 I posted a summary of the latest Taking Part survey of participation in Arts, Sport and Heritage in England. Late last year, figures for the period spanning October 2011 – September 2012 were released by the Department for Culture, Media and Sport (DCMS).

Since the survey began in 2005, these figures have shown an upward trend in the proportion of people who visited a museum or gallery in the previous year. For the first time, that proportion has gone above 50%: 51.6% in the most recent survey period. It rises as high as 57.5% in London, with the West Midlands trailing at 48.5%. Despite these variations, all English regions are seeing an increase in visitation.

Online participation is also growing, but still has a long way to go before it catches up to physical visitation – 28.7% of respondents had visited a museum website in the previous year (up from a mere 15.8% back in 2005/6).

Participation rates remain higher among white and higher socioeconomic demographic groups. However, the rise in participation by non-white and non-Christian people continues, with participation rates of 48.4% and 46.0% in the Black and Minority Ethnic and non-Christian religious communities respectively (compared with 35.4% and 35.3% in 2005/6). Participation rates are also rising across the socioeconomic spectrum.

Participation rates are lowest among the over-75s (27%) and those living in socially rented housing (27.9%). However, in 2005/6 the participation rate among social renters was 24.9%, so this represents a statistically significant increase. (Participation rates among the 75+ demographic remain steady.)

Want to explore further? Summary reports as well as the raw figures (in Excel format) are available from the DCMS website.

Children in Museums and Galleries

The latest version of the Australian Bureau of Statistics’ report Children’s Participation in Cultural and Leisure Activities has recently been released. This report shows participation rates in a range of cultural, recreational and sporting activities by children aged 5-14 inclusive. I’ll focus on museum and gallery visitation here, although for comparison I’ve included public library and performing arts attendance in the following table:

Children’s attendance at cultural venues and events in preceding 12 months (Source: ABS)

This shows that 70% of children aged 5-14 attended at least one library, museum, gallery or performing arts event in the preceding 12 months. Note the figures pertain to activities undertaken outside of school hours, so this does not take into account school visits. The increases in performing arts and museum and gallery attendance are statistically significant, at least when comparing 2012 to 2006.

The following table breaks down the frequency of attendance among participants:

Frequency of attendance (in past 12 months) among participating children (Source: ABS)

Frequency of participation in museums and galleries is comparable with that of performing arts among all age groups, whereas visitation to libraries is more frequent (which makes sense given the nature of library use).

Museum and gallery attendance is not uniform throughout the community, however. Children born in non-English-speaking countries and those living outside capital cities are less likely to attend. There are also slight variations according to gender and age bracket, as well as differences by state of residence. The state differences may be at least partly explained by the fact that some states have more of their population concentrated in capital cities than others.

Source: Children’s Participation in Cultural and Leisure Activities, Australia, Apr 2012 (ABS)

It’s hard to compare these figures directly to the ABS figures for adults, as museums and galleries are reported separately for adults. (Recent ABS statistics for museum and gallery visiting are reported here.)

Who’s visiting (now)?

Late in 2010, I wrote a series of posts based on the Australian Bureau of Statistics report: Arts and Culture in Australia: A Statistical Overview, 2010.

It looks like the 2011 version of this report was issued just before Christmas, although I only found out about its release a few days ago. So I thought I’d look at the 2011 report and compare it to the 2010 figures I blogged about previously, to see if there are any interesting changes (or conversely, evidence of stable patterns).

The first post I’m revisiting is Who’s visiting?, which looked at participation rates by age. (‘Participation rate’ is defined as a person having attended that kind of venue at least once in the previous 12-month period.) Now it looks like the participation rates shown in the 2010 report were based on data from 2005-2006, whereas the 2011 report has more recent figures (2009-2010). So what has happened to participation rates in the intervening years?

Firstly, let’s look at the overall participation rates from each year (NB: the ABS report also includes libraries, archives and performing arts, but these are not included in this analysis):

Attendance rates at Australian cultural venues (people aged 15 or over), as a total figure and as a percentage of the population (Source: ABS).

So it appears that participation rates are increasing across the board, albeit modestly (and the report does not say whether this increase is statistically significant or not). This increase appears to be spread across the age ranges:

Attendance rates at cultural venues by age group, comparing 2005-2006 and 2009-2010 (Source: ABS)

So there is no radical change in any particular age group, and participation follows broadly the same pattern in both 2005-2006 and 2009-2010. As in previous years, the report also showed that women are still slightly more likely to be visitors than men. So there is nothing earth-shattering here, but perhaps there is something to be quietly optimistic about if the increased participation rates are evidence of a slow and steady trend.


Exhibition Attendance Figures: The Art Newspaper

If you’re wondering just how big those blockbuster crowds are, then the annual summaries published by the Art Newspaper are a great place to start.

These comprehensive lists show both overall and daily attendance figures for (predominantly) art exhibitions around the world, ranging from mega-blockbusters drawing 10,000+ visitors a day to far more modest ventures. There are also top ten lists by category, such as “Decorative Arts”, “Impressionists” or “Antiquities”.

There are several years of data available online, going back to 2003 (links posted below).

That represents a fair data set for spotting longer-term trends or changes over time. Looking at this data, I was interested to see how often exhibitions in Japan top the list, with exhibitions in Tokyo and elsewhere often dominating the top ten. This is in contrast to the most-visited museums overall, which are predominantly the big institutions in London, Paris, New York and Washington.

Looking at the Australian scene, the top exhibition for 2010 was the 6th Asia Pacific Triennial at Queensland Art Gallery (34th overall).  Both Queensland Art Gallery and its sister institution, GoMA, managed more than one entry in the top 100 as well as additional entries under specific categories.  Other Australian exhibitions to make the list were Masterpieces from Paris at the National Gallery of Australia and the Tim Burton exhibition at the Australian Centre for the Moving Image.

The 6th Asia Pacific Triennial attracted some 4,400 people a day. That sounds like quite a crowd, until you compare it to the top exhibition overall – Hasegawa Tohaku at the Tokyo National Museum – which attracted over 12,000 people per day!

Benchmarking Museums: Online & Onsite

If you’re interested in which museum is doing what in social media, then you must check out Museum Analytics.

It describes itself as “an online platform for sharing and discussing information about museums and their audiences”. So far there is data for over 3000 museums, including some of the world’s most famous such as MoMA, the Louvre, Tate, and the Smithsonian. (But it’s not just the big global museum brands – I counted at least 20 Australian museums, ranging from the major institutions to a wide range of small and regional ones).

The site lists the most-visited museum websites (the Metropolitan Museum of Art by a fair margin in 2010, it appears) as well as the top Facebook likes and Twitter follows. Museums are also individually listed, and you can see what’s happening closer to home – for instance, these are the summaries for Australia and Adelaide respectively. But it’s not just websites and social media – the site has numbers for onsite visitors as well (although it is the data about online activity that makes this site stand out).

On the topic of museum statistics, there has recently been quite a lively discussion on the ICOM Group on LinkedIn (list members only but it’s easy to sign up) about the information and statistics collected by governments and other bodies around the world. If you’re interested in comparing and contrasting museum statistics from around the world (or even comparing which data are collected, by whom, and why), then I suggest you sign up.

One resource I was directed to from the ICOM discussion was a Culture24 project about how to evaluate museum success online. You can download a detailed report about the research project as well as tools and metrics for evaluating online and social media presence. It’s a must if you’re getting to grips with tools like Google Analytics or just wondering how best to track your online presence.


Museum and Gallery Visits in England

Taking Part, which has been run since 2005, collects data about participation in sport, the arts, heritage, libraries, museums and galleries from adults and children (aged 5-15) in England. The figures show that visits to museums and galleries are on a steady upward trend, with the increase in visitation / decline in non-visitation being statistically significant:

Trends in the proportion of adults in England who have visited a museum or gallery in the previous 12 months (source: Taking Part survey - .xls file available on website)

So, somewhere in the region of 42-46% of adults in England visit a museum or gallery at least once in a given year (and this doesn’t include Heritage sites, which are visited at least once a year by a whopping 70% of English adults).

This fairly steady overall picture conceals considerable variation by geography, demographics and socioeconomic status:

Age and geographic breakdown of museum and gallery participation rates, for the earliest and latest years available (full data set is annual). Figures in bold represent a significant change from 2005-6. (Source: Taking Part statistical worksheet (Museums))

Age and gender breakdowns are pretty self-explanatory, and broadly reflect Australian trends (although the ABS uses slightly different age categories). London residents are the most likely to visit museums, while those in the East Midlands (which incorporates my English ‘hometown’ of Leicester) were the least likely in 2010-11. Interestingly, the East Midlands is the only region to see a fall in participation rates from 2005-6, albeit not a significant drop. It would be interesting to see how the different regional increases correspond to the opening or refurbishment of museums across England over the past few years.

Demographic and socioeconomic data show that museum and gallery visitors are still disproportionately white, wealthy and able-bodied:

Demographic and socioeconomic breakdown of museum and gallery participation rates, for the earliest and latest years available (full data set is annual). Figures in bold represent a significant change from 2005-6. (Source: Taking Part statistical worksheet (Museums))

Participation rates among lower socioeconomic groups, ethnic minorities, disabled people and people of non-Christian religions are all on the increase, which will be encouraging news for all those who have put so much effort into social inclusion projects in museums over the past decade or so. However, given the across-the-board increases in participation, it’s not clear whether any progress is being made in closing long-standing cultural gaps.


NB: I tried to do a compare-and-contrast between the Taking Part report and the Attendance at Cultural Venues statistics published by the Australian Bureau of Statistics, but I ended up tying myself in knots. First off, the ABS report cites participation rates for Art Galleries and Museums separately (with each being in the low-to-mid 20% range – see here for more details). Where a combined rate is given, it appears to have been reached by simply adding the Museum and Gallery figures together (see, for instance, Table 8.1 in the ABS report (PDF)). This is despite the explanatory notes saying that the true proportion will be less than the sum of Museums + Galleries, due to overlap between the visitor populations of each – as you’d expect. I’m actually wondering whether the wrong numbers were published in the report. Either way, I suspect the 46-52% annual participation rates cited by ABS are an overestimate.
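To see why simple addition overstates the combined figure, here’s a quick sketch using purely hypothetical numbers (these are not the ABS rates):

```python
# Why adding two participation rates overstates the combined rate.
# All figures below are invented for illustration -- they are NOT the ABS numbers.
p_museum = 0.23    # proportion who visited a museum in the past year (hypothetical)
p_gallery = 0.25   # proportion who visited an art gallery (hypothetical)
p_both = 0.10      # proportion who did both (hypothetical)

naive_sum = p_museum + p_gallery            # double-counts people who did both
combined = p_museum + p_gallery - p_both    # inclusion-exclusion: the true combined rate

print(f"Sum of the two rates: {naive_sum:.0%}")   # 48%
print(f"True combined rate:   {combined:.0%}")    # 38%
```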


Statistical snapshot of European Museums

Thanks to the ICOM group on LinkedIn, I recently found out about EGMUS: the European Group on Museum Statistics. The group exists to collect and publish comparable statistical data from 27 European countries. I’ve pulled out some of the statistics I thought were of particular interest, but there are also statistics for funding, staffing and management (although these data sets look fairly incomplete at this stage).

The overall picture looks like this:

The countries in the EGMUS sample, the year the data for each country was collected, and the number of museums in each country. (Those in red have some consistency issues, which will become clear later.)

Most European museums are open to the public for at least 200 days a year (a fair criterion for considering a museum to be a ‘public’ institution compared to a facility primarily for specialists or researchers). The major outlier is Switzerland at 14%, although Germany too has a fairly low proportion of ‘public’ museums by this measure.

Some (not all) countries have broken down their museums by type:

Breakdown of European museums by type. Figures in red are those with clear inconsistencies with the 'total museums' figure listed above. The reason for this discrepancy is unclear from the data.

Differences between the respective countries are clearer when the data are presented graphically:

European museums by country and type. Many countries have almost (or actually) exclusively Art, Archaeology and History museums. Switzerland, Germany and Luxembourg are the only ones to have mostly Science, Technology and Ethnology museums.

‘Art, Archaeology and History’ is quite a broad definition; probably too broad to give any detailed comparison between countries. Putting ‘ethnology’ in with science and technology also seems a bit weird to me and I wonder what the reasoning is for this.

It’s hard to compare to the Australian statistics, which use quite different definitions – Art Galleries (14%), Social History (60%), Historic Properties (21%), Natural, Science and Other (5%) (from ABS figures summarised in a previous blog post). However, I’d imagine that by the European definition most of Australia’s museums would also fall into the ‘Art, Archaeology and History’ category.

The EGMUS figures also look at the number of visits per country and (on average) per museum:

Total and average number of visits to European museums. NB: Some of these are combined from multiple tables. Figures in blue are not directly cited in the EGMUS tables but were derived from other data provided. Figures in red showed internal inconsistencies in the published data (beyond a 5-10% margin of error).

The clear outlier here is Switzerland, and I’m not sure if this is a typographical error or is in some way related to the very low proportion of museums that are open for more than 200 days a year. Even ignoring Switzerland, however, there are still considerable differences in the number of museums per head of population between European countries (which don’t seem to relate to geographical or socioeconomic differences between countries in any obvious way).

A lower proportion of visits to European museums are free compared to Australia, where an average of 68% of visits are free entry. Per-capita visit numbers are comparable, however: my back-of-the-envelope calculation for an Australian figure (taking the ABS visitor stats cited above and assuming an Australian population of 22 million) comes out just shy of 140,000 visits per 100,000 inhabitants, with an average of roughly 26,000 visits per museum.
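For transparency, here’s the shape of that back-of-the-envelope calculation. The inputs below are my own working assumptions, chosen to be consistent with the figures quoted above rather than copied directly from an ABS table:

```python
# Back-of-the-envelope sketch of the Australian comparison figures.
# Inputs are assumptions consistent with the post, not values from an ABS table.
total_visits = 30_800_000   # assumed annual museum visits across Australia
population = 22_000_000     # assumed Australian population
museum_count = 1_185        # assumed number of museum organisations

visits_per_100k = total_visits / population * 100_000
visits_per_museum = total_visits / museum_count

print(f"{visits_per_100k:,.0f} visits per 100,000 inhabitants")  # ~140,000
print(f"{visits_per_museum:,.0f} visits per museum")             # ~26,000
```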