Tuesday 28 January 2014

When Perceptions are Everything, Community Profiling is the Key

Peter Cole


Principal Consultant





The UK’s major infrastructure engineering industry is currently enjoying a relative boom, with a large number of proposals being progressed at a rate of knots for a range of projects, from high-speed rail and major new energy infrastructure to super sewers and significant airport expansion.
The various planning mechanisms that have been put in place to deal with these applications stress the importance of public engagement and early consultation with potentially affected communities. It is therefore perhaps no surprise that the socio-economic assessments traditionally carried out to inform these applications have evolved to place a stronger focus on the ‘softer’ non-economic impacts of a scheme upon a given community. Increasingly, these impacts are meriting their very own assessment and topic chapter within an Environmental Statement – the ‘Community Impact Assessment’.

Recognising that this is a relatively new science, just how effective are these assessments at really getting to the heart of the matter – or rather, the heart of the community?
Defining the sensitivity of a community to any particular impact isn’t just a case of looking at local demographic statistics and counting user numbers. Where impacts are likely, engagement should actually be part of the assessment methodology. It is important to realise that a community’s ‘perception’ of impacts may be very different from the ‘actual’ impacts as normally assessed in an Environmental Impact Assessment (EIA). Understanding perceived impacts, however, is key within a Community Impact Assessment in terms of engaging positively with a community, applying the correct sensitivity to resources and identifying the most acceptable mitigation.

There’s no hard and fast methodology for applying this principle to Community Impact Assessment, as clearly the approach needs to fit the project, the geographical area and, crucially, the characteristics of the communities likely to be affected. Having said that, in the majority of cases face-to-face engagement would seem to be a given, with resident/user questionnaires and interviews with local community organisations being a good starting place, complementing demographic characterisation through desktop research.
As discussed, the process can be utilised by developers to ensure that they have considered (and are fully apprised of) a community’s sensitivities and priorities when developing alternatives and mitigation/compensation measures. In addition, Temple Group has also recently used this approach to aid local authorities who wish to understand the potential implications of major infrastructure proposals on their communities – giving the local authorities a sound basis for challenging assumptions within developers’ Environmental Statements.
Temple Group is now looking to take the Community Profiling concept one step further and develop uses and benefits for contractors post-consent. By reviewing mitigation requirements at the pre-construction stage, a contractor may be able to sense-check these requirements (given that some time may have passed since they were originally conceived) and thereby avoid costly ineffective actions, or disruption to work programmes caused by poor relations with local stakeholders.

Wednesday 22 January 2014

Procuring Value

Martin Gibson


Technical Director, Sustainability


Last week I was lucky enough to be in a forum discussing leadership in sustainable construction. It included sustainability leaders from some key construction companies. We covered a lot of ground, but one of the key issues that came up was that most clients still decide bids on the lowest capital cost.
A few years ago this is what you would have expected in construction. A low cost bid wins the contract and then the winner spends the term of the contract making claims for things that weren’t covered. The success of the contract would be measured by how close the final cost was to the original budget and whether the work delivered was close to what the client had asked for.

However, things should have moved on by now!

All the talk of the past few years has been of procuring the best value. Cost is only one aspect of value.

The value should include how well a building or project delivers its intended outcome. For example, a high value new school will help children to achieve higher grades, will have low running costs and will help teachers to perform to their optimum level. It will do this over its entire lifetime. The school with the cheapest capital cost is unlikely to be the one that achieves this! The cost to society of a school underperforming is likely to be many times higher than a slight difference in capital costs between a standard and an exceptional building.
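To make this concrete, the sketch below compares two hypothetical schools on whole-life net present value rather than capital cost alone. Every figure in it (capital cost, running cost, the monetised outcome benefit, lifetime and discount rate) is invented purely for illustration; a real appraisal would need evidenced numbers.

def whole_life_value(capital_cost, annual_running_cost,
                     annual_outcome_benefit, years=50, discount_rate=0.035):
    """Simple net present value: discounted annual benefits minus running
    costs, less the up-front capital cost. All inputs are hypothetical."""
    npv = -capital_cost
    for year in range(1, years + 1):
        npv += (annual_outcome_benefit - annual_running_cost) / (1 + discount_rate) ** year
    return npv

# Hypothetical figures (GBP): the 'exceptional' school costs 10% more to
# build, but runs cheaper and supports better educational outcomes.
standard = whole_life_value(20e6, annual_running_cost=0.9e6,
                            annual_outcome_benefit=1.5e6)
exceptional = whole_life_value(22e6, annual_running_cost=0.7e6,
                               annual_outcome_benefit=1.8e6)

print("Standard school NPV:    %12.0f" % standard)     # roughly -5.9 million
print("Exceptional school NPV: %12.0f" % exceptional)  # roughly +3.8 million

On these invented figures the cheaper building destroys value over its lifetime while the dearer one creates it. The point is not the numbers themselves, but that the comparison only becomes possible once the value of outcomes has been quantified.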

Unfortunately, there are lots of barriers to procuring the best value. One of them is inertia: it is easier to do things the way they have always been done. Another is that it is often very hard to measure value. In the example of the school, determining the value of a school’s performance is difficult and will require the collection and analysis of a range of data over a long period. This is an area where there will be a lot of benefit from collecting and using the so-called ‘big data’ that we can now store and analyse. However, we aren’t doing much of that yet.

Is it time for a new business model? The government has tried investment models on the rehabilitation of prisoners (http://www.bbc.co.uk/news/uk-11254308). Could we ask private finance to invest in the extra capital cost during construction for exceptional schools? Can the benefits to society be quantified and a model of return identified?

It seems as if we aren’t yet close to having the approaches needed to really procure the best value. However, we should start trying. Tools such as Geographical Information Systems (GIS) and Building Information Modelling (BIM) will have an important part to play in this, but how far can the models go? Robust evidence of future performance which factors in returns to society would enable better investment decisions. Whether or not you use such tools, the important first step is to understand the value of the outcome you are looking for. Then you need to work out which data can verify that you are achieving it.

So the next time you are looking at procurement, think carefully about what you are trying to achieve and what you need to measure. The lowest capital cost may also give the lowest value.

Friday 10 January 2014

Developing Agreement Over Noise

Simon Perry


Principal Consultant - Acoustics

An article in the Evening Standard on 7 January outlined a consensus approach to reducing noise nuisance. The Ministry of Sound nightclub in Elephant & Castle has apparently come to an agreement with Englewood, the developers of a new 41-storey tower (Eileen House), that will allow it to maintain its 24-hour music licence.
The introduction of new noise sensitive developments, such as residential properties, in an area with existing commercial or industrial noise sources inevitably comes with the risk that the new sensitive uses may result in a restriction in trade for the surrounding noise generating businesses.
The planning process in the UK, when appropriately applied, is designed to protect existing uses. In cases such as this, the developer must ensure during the planning process that the proposed development does not present a risk of restriction in trade; as such, mitigation of the existing noise sources should be undertaken by the developer.
To protect against any potential noise nuisance, the level of noise received in the proposed properties must be carefully controlled. This can be achieved by noise control at source (e.g. upgrading the building fabric or reducing the noise generated) and by careful design of the new development (layout and building façade composition).
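To give a rough feel for the numbers involved, here is a deliberately simplified, hypothetical illustration; the levels are invented, and a real assessment would work with octave-band spectra and methods such as BS EN 12354-3 rather than a single-figure subtraction.

# All figures below are hypothetical, for illustration only.
external_level_db = 80.0      # assumed music noise level at the facade, dB
internal_criterion_db = 30.0  # assumed internal design level for bedrooms, dB

required_level_difference_db = external_level_db - internal_criterion_db
print("Required facade level difference: %.0f dB" % required_level_difference_db)

# A difference of around 50 dB is demanding: it points towards sealed,
# high-performance glazing and buffering features such as winter gardens.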
Englewood has agreed to the second approach with the Ministry of Sound. They are going to incorporate enhanced sound insulation into the building façade and other noise reduction features such as ‘winter gardens’. This will mean that an existing business can continue to be successful and new residents will have an acceptable internal noise climate.
At Temple, we have often helped developers to monitor and manage their noise levels to help avoid potential nuisance occurring. We have also helped to ensure that new developments are suitably designed so that acceptable internal noise levels are achieved. This usually involves helping developers and local councils to understand the noise issues and the different mitigation options available. It is good to see other examples of where good communication helps avoid noise nuisance. We look forward to helping others to adopt this type of positive approach.

Wednesday 8 January 2014

Artificial Lighting - Friend or Foe?

Rob Lockwood
 
Technical Director


What would we do without it? In the right place and at the right time, artificial lighting has real benefits. It could be street lighting, security lighting, decorative lighting or advertising, and it can help improve safety, prevent crime and facilitate a lively night-time economy. However, the flipside is that, if it is not properly managed, it can also have a wide range of harmful effects, including pollution of our night skies, annoyance to neighbours, impacts on wildlife and wasted energy.

Yesterday Defra published their progress report on the steps they and other Government departments have taken to address the issue of light pollution following the Royal Commission on Environmental Pollution (RCEP) 2009 report.  The report can be found here:

https://www.gov.uk/government/uploads/system/uploads/attachment_data/file/269402/pb14108-artificial-light-progress-dec2013.pdf

As Technical & Policy Advisor to Defra (2008-2013), Temple are very proud to have played a key role in developing and evolving this area of government environmental policy and making key contributions to the publications outlined in the report.  During that time we also built strong working relationships with key stakeholders such as the Campaign to Protect Rural England, Campaign for Dark Skies and the Institution of Lighting Professionals. We hope these good relationships continue to bring improvements in the future.

Tuesday 7 January 2014

Putting Time on the Map

Stephen Bell
Senior Consultant (GIS)
 
With the increasing amount of data available, it can be hard to make sense of it all. GIS helps to make sense of data through mapping and spatial analysis and, to a lesser degree, through visualisation of time-based data. This article shows how you can extend your temporal data capabilities by utilising Python.
For those engaged in the geospatial world, it is no great insight that the temporal dimension is less well catered for in GIS than its spatial cousin. Over recent years steps have been made to include more in the way of time data management and visualisation but, with time being a fundamental element of geographic processes, temporal tools are still too sparse.

If access to relevant tools or functionality is not possible then sometimes you have to get creative in order to handle and visualise data in a manner suited to your needs.
The Python scripting language is a powerful yet lightweight and user-friendly toolkit that can be used to help you understand your data better. For example, the ClockCluster function in the Clock module has been developed in Python to visualise the temporal distribution of event data. It reads time-based information from .csv files (a standard format for exchanging data) and, using the Tkinter drawing module in Python, displays the information in a ‘clock format’, grouping the event data into 24 one-hour boxes displayed as a clock. This allows the change in the data over a 24-hour period to be visualised (light to dark displaying low to high event volumes) in order to see where high and low clustering exist.
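To illustrate the general approach, here is a minimal sketch along the same lines (this is not the actual ClockCluster code): it reads a time column from a .csv file, buckets events into 24 one-hour bins and shades each wedge of a clock face from light to dark by event volume. For simplicity it draws a single 24-hour ring rather than separate am/pm rings, and the column name and time format are assumptions about the input file.

import csv
import tkinter as tk  # 'Tkinter' in Python 2
from datetime import datetime

def hourly_counts(csv_path, time_field, time_format="%H:%M:%S"):
    """Return a 24-element list of event counts per hour of day."""
    counts = [0] * 24
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            counts[datetime.strptime(row[time_field], time_format).hour] += 1
    return counts

def draw_clock(counts, size=400):
    """Draw 24 one-hour wedges, shaded light (few events) to dark (many)."""
    root = tk.Tk()
    canvas = tk.Canvas(root, width=size, height=size, bg="white")
    canvas.pack()
    peak = max(counts) or 1
    for hour, n in enumerate(counts):
        shade = int(230 - 200 * n / peak)  # higher volume -> darker grey
        canvas.create_arc(20, 20, size - 20, size - 20,
                          start=90 - hour * 15, extent=-15,  # clockwise from 12
                          fill="#%02x%02x%02x" % (shade, shade, shade),
                          outline="grey")
    root.mainloop()

# Hypothetical usage with the LFB dataset; the column name is assumed.
draw_clock(hourly_counts("lfb_calls_2009.csv", "TimeOfCall"))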

The example below displays the temporal patterning of emergency calls to the London Fire Brigade (LFB) in 2009; the dataset (freely available from the London Datastore: data.london.gov.uk) contains over 135,000 records encompassing the entire year.  Dealing with large datasets such as this can be daunting and time-consuming.  The ClockCluster function takes just a couple of seconds to read all the temporal data and visualise it appropriately.
The only input required from the user is to specify the .csv file and the field within it that has the time information.
With the am hours on the outside and the pm hours on the inside, we can clearly see that the fewest calls received by the LFB occur between 4 and 6pm, whilst most of the morning sees a high volume of calls, with the peak between 6 and 7am.

Python can be very powerful for the manipulation, analysis, and visualisation of data, without the need for expensive software, and here at Temple we're big supporters of it.