Description

The WaterBug is an autonomous 'Roomba-like' drone that waters your garden for you and that you can control and track with a companion application on your smartphone.

This process journal details the steps my team and I took to create and refine the interface for the companion application, from early sketches and ideation through to user testing.

Tags: Resource, Project
Date: 25-08-2018

Problem space


It’s 2018 and autonomous vehicles are becoming prevalent, from Google and Tesla’s cars, to DJI’s drones, to China’s proposed automated transport system. As such, our project brief was designed to push us into this emerging field and to consider how users might control these vehicles, and how we might design the interfaces between users and AI.

The idea was to conceptualise and prove the need for an autonomous product, then design and develop the user interface for it. Since autonomous vehicles are still a very broad field, we split into teams and each chose an area on which to focus our research. We chose urban agriculture; that is, any farming or gardening in inner-city environments. We felt that agriculture in itself was an easy target for automation, and that urban agriculture was both more accessible than the more commonly considered rural agriculture and offered more room for innovation, as it's a more overlooked space.

Methodology


Research Methodologies

Since our team consisted of four university students who had no gardens of their own, we had our work cut out for us. To tackle the area we split the sector up into four main stakeholder groups that we thought would offer the most insight for our project: Council, Developers, Architects and Decentralised farmers. I chose to research the council's role because it seemed they had the most direct and immediate impact on day-to-day urban agriculture.

Foundational research

I built the basis of my understanding of urban agriculture from online resources such as council documents, media reports and blog posts. I scanned for any pre-existing problem areas and used my new knowledge to inform the following stages of research.

Academic Research

I accessed the University Library and online research databases such as ABARES, AgEcon and JSTOR. I used these academic resources both to validate my previous findings and to identify any other issues in the urban agriculture space.

Semi-Structured Interview

I contacted a City of Sydney employee to gain an understanding of the urban agriculture space from the council's perspective and how it might contrast with the corporate perspective that my partner JX was researching.

Contextual Observations

Since the employee I spoke with was from the head office in the CBD, I wanted to get an on-the-ground understanding of the council's role in urban agriculture. I attended a workshop at the City of Sydney's community farm in Tempe and took notes on the plants, layout, participants and organisers.

Contextual Interview

After recording my observations at the community farm I interviewed the attendees and organisers of the city agriculture workshop. I conducted the interviews to understand what drew people there, how effective the workshops were, and whether there were any pre-existing issues in the space.

Personas

After conducting these rounds of research I distilled my findings into three personas reflecting three potential users whose needs we would have to meet if we were to develop a product or service in the urban agriculture sector. You can view these personas in the image carousel below the research findings section.

Findings


Whilst I approached this research with an open mind, it was only afterwards that I realised how rigid and simplistic my understanding of urban agriculture had been. Prior to conducting the research I had a very operational view of the sector: people have plants and crops in their garden to get food, look pretty and maybe even make money. After investigating further I found there were benefits to urban agriculture whose scope and breadth I hadn’t imagined: urban agriculture is a multi-million dollar industry that councils and governments are severely undervaluing, community gardens are a pillar of society, gardens provide people with invaluable fulfilment, and while people love to garden, life has an awful habit of getting in the way.

During the foundational research stage, I reviewed documents from the Yarra Council in Melbourne concerning their urban agriculture strategy. The key takeaway from the report was that community gardens are viewed as a ‘holistic solution to an array of social issues in the community’. Perhaps it was just that urban agriculture isn’t typically reported on and backyard success is appealing subject matter, but during my online research I found many reports that urban agriculture is a profitable niche and that produce from backyard farms is of a higher quality than mass-market alternatives, and thus deserving of a higher premium.

The academic research elaborated further on the financial viability of urban agriculture and also made me realise the huge role government and councils play in fostering urban farms, and have been playing for decades, dating back to the ‘Grow your own’ campaign launched by the Commonwealth Department of Commerce and Agriculture in 1943. More startlingly, a longitudinal study by Dr S. James and Professor O’Neil on behalf of the NSW Department of Planning looked at all the agriculture projects in Sydney over 20 years and consistently found, year after year, that the scale and value of Sydney’s agriculture was severely undervalued by the ABS, with underestimations of 10 - 100 million dollars.
Following these rounds of formal research I relaxed things a bit and got coffee with a City of Sydney employee in the cultural strategy department. He hammered home the social benefits of community gardens, particularly for those who are elderly or disadvantaged. He was also critical of industry developments concerning urban agriculture and skeptical of their motivations, explaining that Central Towers (a development known for the plants that hang along its side and for its community garden, and one that JX was also investigating for his sector) had actually implemented its iconic vertical garden to cover up a poorly designed exterior that doesn’t meet ‘best practice’ and which would otherwise be unacceptable.

My contextual observations at the City of Sydney community garden workshop in Tempe showed there clearly wasn’t any issue with scale or funding, which went against my expectations. However, I did notice that everything was done manually, from carrying to planting to watering, so there could definitely be an opportunity there for automation.
Similarly, the contextual interviews weren’t quite what I expected. Different demographics had different motivations for being into urban agriculture: the older participants were there more for the social element, whilst the younger participants were there to learn and engage with farming from a sustainability standpoint.

Below I've generated three personas that encapsulate user needs drawn from these findings.





After we developed our personas and problem statements we set to work creating concepts; we each came up with two to three potential solutions that incorporated autonomous vehicles. In the carousel below you can see the two that I came up with:

The first sketch is a concept called Garden Guardian, loosely based on the aged care robots that are starting to be implemented in Japan. The idea was mainly drawn from problem statements found in my field of research, exploring the holistic benefits of gardening, particularly for the elderly: it gives them a purpose to get out of bed and keeps their body moving. The idea behind the robot was to assist elderly people with their gardening without taking the task away from them entirely, helping when needed and remaining passive whenever possible. Additionally, the robot keeps an eye on the older individual and, if any harm comes to them, notifies the relevant authorities immediately. Lastly, the Garden Guardian was conceptualised to provide companionship when gardening. Of course nothing can replace the companionship of a real person, but as has been found in Japan, there are unfortunately many cases in which it’s a robot or nothing, and a robot is far better than nothing.

However, this concept was ultimately abandoned: although we weren’t building the physical device, we found the technology to achieve our desired outcome is far from developed even at the cutting edge, let alone accessible for this sort of project. Perhaps by the time we’re retired companionship robots will be commonplace, but at this stage the idea was just too sci-fi for the scope of our project.

The second is WaterBug (the concept we ended up running with), an autonomous consumer drone similar to iRobot’s iconic ‘Roomba’ vacuum cleaner, only this drone more closely resembles an R/C toy combined with a ‘Super Soaker’. This little all-terrain vehicle is designed to navigate your backyard autonomously and water each plant according to its specific needs before refilling and recharging at its docking station. The main problem it was trying to solve was the prolific user struggle with garden maintenance; many people loved having a garden and even growing vegetables, but they often abandoned their efforts due to time constraints created by the rigorous upkeep demands of each plant. Below you can see this sketched out in a storyboard depicting an office worker who has planted a lovely garden, only to have it dry out because he doesn’t have the time to maintain it.



Following this round of ideation, we were ready to move on to user testing so that we could begin the iteration and reduction phases of our design process. WaterBug was selected along with Jason's Roaming Garden and LifeCycle, as well as Abhinav's Air Agriculture.
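Since WaterBug is the concept the rest of this journal follows, here is a minimal sketch of the behaviour described above: visit each plant, water it according to its own needs, and return to the dock to refill and recharge. We never wrote any firmware (our deliverable was the companion app interface), so every name, type and threshold here is a hypothetical illustration, not the product's actual logic.

```typescript
// Illustrative sketch only; no real WaterBug firmware exists.

interface Plant {
  id: string;
  name: string;
  litresPerVisit: number;   // how much water this plant needs per visit
  intervalHours: number;    // how often it should be watered
  lastWateredAt: number;    // epoch milliseconds
}

interface DroneState {
  waterLitres: number;      // water remaining in the tank
  batteryPercent: number;
}

const TANK_CAPACITY_LITRES = 2.0;

function isDue(plant: Plant, now: number): boolean {
  return now - plant.lastWateredAt >= plant.intervalHours * 3_600_000;
}

// One pass of the garden: water every plant that is due, returning to the
// dock whenever water or battery runs low.
function runWateringPass(plants: Plant[], drone: DroneState, now: number): void {
  for (const plant of plants) {
    if (!isDue(plant, now)) continue;

    if (drone.waterLitres < plant.litresPerVisit || drone.batteryPercent < 15) {
      returnToDock(drone); // refill and recharge, then carry on
    }

    navigateTo(plant);     // path-finding is out of scope for this sketch
    drone.waterLitres -= plant.litresPerVisit;
    plant.lastWateredAt = now;
  }
  returnToDock(drone);
}

// Stubs standing in for the drone's navigation and docking behaviour.
function navigateTo(plant: Plant): void {}
function returnToDock(drone: DroneState): void {
  drone.waterLitres = TANK_CAPACITY_LITRES;
  drone.batteryPercent = 100;
}
```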

Early Sketches


Now that we had our four concepts it was time to flesh them out and test them. We each developed hand-drawn sketches that were then fed into prototyping tools: POP/Marvel, InVision and Framer. We used these very rough sketches to test out the user flow of our interfaces with test subjects, using ‘think aloud’ protocols, observations and post-test interviews to ascertain our results. Below you will see two carousels: the first displays the early sketches I developed for WaterBug’s onboarding and installation screens alongside JX's early sketches for system functionality, and the second shows us testing with users in the field. The test subjects we chose were young attendees of the Jubilee Park community garden space, aged 19 - 31.

Waterbug Sketches

Early User Testing

Findings


Overall, user reception to the application and its user flow was positive. However, it was in this initial round of testing that we first encountered the dichotomy between different groups of our potential users. Since this product is supposed to be a mass-market consumer electronics device, we had to balance the needs of the more technologically savvy ‘super-user’ with the needs of the less adventurous common user. This meant numerous iterations of calibration, a ‘two steps forward, one step back’ approach, experimenting with more technical or more accessible designs. I’ll elaborate further on this in the wireframe stage.



Wireframes

This section catalogues 3.5 iterations of the user interface, detailing shifts closer towards a balance between usability and capability. The first few were designed by Abhinav Bose and the rest by me.

Results

The idea behind this round of testing and evaluation was to understand whether this iteration presented an interface that offered the best of both worlds; to see how a range of users responded to our interface and whether or not we were meeting both parties' specific needs. As previously mentioned, after our user research we decided that a mass-market consumer electronics device would be the best direction for our product, and we thought it would be easy to design a one-size-fits-all interface. However, across our various rounds of iteration and testing we battled to meet the expectations of our users. We kept finding that the designs were often only good at accommodating one specific user group: either the technical subset of users who wish to glean the maximum functionality from our product, or the passive user who wants an autonomous service they can switch their brain off and trust. To resolve this dichotomy we developed a ‘soft induction’ approach that quickly and easily familiarises users with the setup and functionality of our device and interface, then gives them a choice: let the drone function autonomously at its default settings, or (for the power users, who proved to be more inquisitive anyway) navigate through the UI to find the additional functionality.
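To make the ‘soft induction’ idea a little more concrete, here is a rough sketch of how the underlying settings model could look: sensible defaults applied the moment onboarding finishes, with an advanced panel where power users can override individual values. The field names and values are assumptions for illustration only, not the shipped design.

```typescript
// Hypothetical sketch of the 'soft induction' settings model.
// Passive users never need to touch it; power users override pieces of it.

interface WaterBugSettings {
  wateringWindow: { startHour: number; endHour: number }; // when the drone may run
  rainSkip: boolean;          // skip a pass if rain is forecast
  quietMode: boolean;         // slower, quieter driving
  perPlantOverrides: Record<string, { litresPerVisit?: number }>;
}

// Defaults applied the moment onboarding finishes.
const DEFAULT_SETTINGS: WaterBugSettings = {
  wateringWindow: { startHour: 6, endHour: 9 },
  rainSkip: true,
  quietMode: false,
  perPlantOverrides: {},
};

// Reached via the advanced panel; most users never call this.
function applyOverrides(
  base: WaterBugSettings,
  overrides: Partial<WaterBugSettings>
): WaterBugSettings {
  return { ...base, ...overrides };
}

// Example: a power user narrows the watering window and gives the tomatoes more water.
const tweaked = applyOverrides(DEFAULT_SETTINGS, {
  wateringWindow: { startHour: 5, endHour: 7 },
  perPlantOverrides: { tomato: { litresPerVisit: 0.5 } },
});
```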

Process


After we broke down exactly what we needed for the user flow from our wireframes, I designed each of these screens in Adobe Photoshop. I incorporated a dark theme so that the app draws less battery on AMOLED screens, and I opted for a futuristic UI to complement the futuristic nature of an autonomous gardening drone.
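The actual screens were built as Photoshop artboards rather than code, but to illustrate the dark-theme rationale, here is the kind of colour token set the design implies; the names and hex values are representative rather than the exact palette I used. On AMOLED panels, near-black backgrounds keep most pixels switched off, which is where the battery saving comes from.

```typescript
// Representative dark theme tokens (illustrative, not the actual palette).
const theme = {
  background: "#0A0E12",   // near-black base, AMOLED-friendly
  surface: "#151B22",      // cards and panels
  primary: "#3DDC84",      // accent for calls to action and drone status
  textPrimary: "#F2F5F7",
  textSecondary: "#9AA7B0",
} as const;

export default theme;
```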

Screens

I developed these base screens, and the rest of the UI was fleshed out in the prototyping phase. I opted to use Adobe Photoshop after initially attempting Sketch and Figma, because I found it gave me more flexibility and creative freedom in fine-tuning individual assets to create a more visually impressive design, and because I knew we were developing the prototype in Adobe XD, so transferring the files over would be a breeze.

Getting it working

Now that we had the validated user flow, UI structure and aesthetic, we were ready to implement our designs in prototyping software to create a clickable 'dummy' app that gives users the experience of using our app before we endeavour to actually develop it. We used Adobe XD to design and animate our project.

If you'd like to download the prototype, don't hesitate to contact me!