To get a user-friendly tool


Those following our project will know that we want to listen to and get feedback from our users and potential future users through user testing. This will ensure that the tool we're developing provides a quality experience and a comfortable journey for its users. The hope is that the tool we are developing will bring the planning application service into the 21st-century digital era. Our aim is to help users navigate the process, so that they do not need to search multiple websites to find out what information they need to submit their application.

To achieve this we have built a clickable prototype on Figma.

This allows us to design and test the digital experience, helping us understand how users will engage with the new online tool we are developing and capture their feedback.

The sessions will take those volunteers through the submission experience using the clickable prototype. Members of the partner project team will observe their responses to the session questions and note down the feedback the volunteers give around how they find the tool, what works well, what the experience is like for them and what they think could be improved.

Those who have been following our blogs will know from the last post that we have also been calling out for volunteers on our Twitter and Facebook accounts, as well as contacting those from the Alpha phase (largely Lambeth-based volunteers). We've still not been able to reach volunteers via the information held on the partner council databases, due to their Information Governance teams' reticence about agreeing that we can use the Legitimate Interest Assessment approach. We remain hopeful of a change of mind in the near future.

A new find, however, has been the discovery of the social site https://nextdoor.co.uk

We've set up trial accounts to test this as an option. This social media site is more localised and allows 'street level' targeting, so there is hope that our reach will improve. Since the last post, where we had one confirmed volunteer, we have received a further nine responses from people interested in being involved in usability testing sessions. Seven sessions have been confirmed for the next two sprints.

· Five user testers have submitted a planning application within the past two years. This will hopefully allow us to find out how these users feel about the new system when comparing their experience with the other systems they used to submit their applications.


· Three of our user testers are residents

· Four users are agents

· One is a property developer.


We want our tool to work well for users with different levels of planning knowledge and background. Therefore, it needs to satisfy both developers who have a good understanding of planning and experience of submitting applications, and users with little or no planning knowledge.


Following on from our last blog, we have had the feedback from our first user testing session, with an interested Lambeth resident.

The key notes and findings from that session:

· Overall, our user understood the purpose of the site that we are working on, and was able to navigate and progress with the prototype with no major difficulty.

· The participant had a minor problem navigating into the ‘I want to’ options of the tool and the ’Task list’ section.


· The participant didn’t understand how the list function could be opened out to show the various options. When the participant understood, they liked the options provided and thought it was clear.

· The user thought some of the language and terms used on the site were not very user friendly and a bit unclear.

For example, they suggested using the phrase ‘you will be told’ instead of the phrase ‘You will be made aware of’.


The word 'Constraints' also confused them in the context of planning applications. This could therefore benefit from being phrased in language that is more easily understood by the lay person.

· The participant also highlighted to us that some part of the journey required more technical knowledge which some non-professional users may find difficult to complete themselves. For example, ‘Draw the site boundary’ would be difficult for the lay person to be confident in doing properly.

· The participant couldn't see some of the highlighted areas on the page, and even after being prompted, they couldn't see the blue highlight on the landing page. This suggests we have more work to do in considering the accessibility of the design.

Overall, though, the participant thought the new design was clearer for users and was pleased that we were working on the development of this tool.

Benefits case and metrics for Beta

Following on from our last blog and workshop 7, the project partners continued discussions about how we can measure the impact of the new tool and capture how its performance differs from existing systems.

Part of our MHCLG funding requirement is to demonstrate the benefits of the tool we're developing. During this most recent sprint, Camden shared their customer survey, and this was part of our agenda for workshop №8. This type of metric will capture the benefits to user experience and will help us measure customer satisfaction pre-tool and post-tool. Work on the survey included the partners working with the Miro board and using virtual post-it notes to capture suggested questions that could be included.

Miro Board: Partner workshop №8

This information will be collated into one spreadsheet and reviewed with the aim of agreeing the final questions / survey format at the next partner workshop on the 25th September.

In Alpha, we looked at time metrics, largely using Wycombe data, though Lambeth also ran a one-month mini time-recording exercise for validation.

This gives us an idea of the average time taken to validate an application using the existing systems. Once the private beta is up and running, we can then take the average time taken to validate similar applications from the same scheme on the tool. Comparing these two averages will provide an indication of how much time the tool saves in the validation process.


Other cost metrics assessed in Alpha included applying an hourly rate to that time to give a financial cost, producing an average cost per application alongside the time information. This is another metric we can usefully continue to collect in the Beta phase of the project.
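As a rough sketch, the time and cost comparison described above could be computed along these lines. All figures and names below are purely illustrative, not real project data; the actual inputs would come from the partner councils' time-recording exercises.

```python
# Sketch of the Beta benefits calculation: compare average validation
# times pre-tool and post-tool, then cost the time saved at an hourly rate.
# All figures are illustrative placeholders, not real project data.

def average_minutes(times):
    """Mean validation time in minutes for a set of applications."""
    return sum(times) / len(times)

# Hypothetical validation times (minutes) for similar applications
pre_tool_times = [42, 55, 38, 60, 47]    # existing systems
post_tool_times = [30, 35, 28, 40, 32]   # new tool (private beta)

pre_avg = average_minutes(pre_tool_times)
post_avg = average_minutes(post_tool_times)
time_saved = pre_avg - post_avg          # minutes saved per application

HOURLY_RATE = 30.0  # illustrative officer cost per hour (GBP)
cost_saved_per_application = time_saved / 60 * HOURLY_RATE

print(f"Average time saved per application: {time_saved:.1f} minutes")
print(f"Estimated cost saved per application: £{cost_saved_per_application:.2f}")
```

In practice the comparison would need to control for application type and complexity, since validation times vary widely between schemes.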

Next Show and Tell — Friday 25th September 2020.

Written by

We’ve made it to Beta. An MHCLG funded project, led by Lambeth with five partners — Buckinghamshire, Camden, Lewisham, Northumberland and Southwark
