Projects



The HIT Lab AU has been working on many different and diverse projects across many disciplines. As these projects are initiated this page will be used to record and describe the work being undertaken in each project.


Food Product Innovation

(1) Oyster Project
Aquaculture is a significant coastal industry in Tasmania and requires advanced decision-support tools to ensure sustainability and growth under climatic (rising temperatures, variable rainfall, more frequent severe events), social (changing demographics, other industries) and environmental (harmful algal blooms, nutrient and pollutant run-off) uncertainty.

Current environmental monitoring for, and by, the aquaculture sector is uncoordinated, and reporting is often retrospective, so it is of little value for much of the decision making done by farm managers and regulators. Real-time sensing and networking of multiple sensor systems, integration into existing and new biogeochemical models, and correlation with key environmental, biological and production events will give all stakeholders greater predictive and decision-making capability.
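
As a minimal illustration of the kind of real-time decision support described above, the sketch below (in Python) flags periods of elevated risk from streaming temperature and salinity readings. The threshold values, site names and risk labels are purely illustrative assumptions; the actual biogeochemical models are far more sophisticated.

    from dataclasses import dataclass

    # Hypothetical thresholds for illustration only; real risk models combine
    # many biogeochemical variables and are calibrated per growing area.
    TEMP_RISK_C = 20.0        # water temperature above which risk rises
    SALINITY_RISK_PSU = 30.0  # low salinity after heavy rainfall run-off

    @dataclass
    class Reading:
        site: str
        temperature_c: float
        salinity_psu: float

    def assess(reading: Reading) -> str:
        """Return a coarse risk label for a single sensor reading."""
        if reading.temperature_c > TEMP_RISK_C and reading.salinity_psu < SALINITY_RISK_PSU:
            return "HIGH"
        if reading.temperature_c > TEMP_RISK_C or reading.salinity_psu < SALINITY_RISK_PSU:
            return "ELEVATED"
        return "NORMAL"

    if __name__ == "__main__":
        stream = [
            Reading("Pittwater-1", 18.2, 33.1),
            Reading("Pittwater-1", 21.4, 28.7),
        ]
        for r in stream:
            print(r.site, assess(r))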


Fig. 1 Oyster Mobile App



Fig. 2 Oyster AR Glasses App



(2) Dairy Farm Project
The Australian dairy industry is a key rural industry and the third-largest agricultural sector, with a farm-gate value of $4 billion, directly employing over 43,000 Australians. Increasing intensification of both the farm and the farming business has created a greater need to manage information flows on-farm. As the majority of Australian dairy farms are predominantly pasture based, integrating and presenting key biophysical, pasture and herd data on users' mobile devices will rapidly enhance the effectiveness and efficiency of on-farm management. As farms become more intensive and corporate in their operation, the ability to integrate and present historical, current and forecast data relating to the key performance indicators of the business is viewed as an important operational need.

By using sensor technology from the “Sense-T” network, the Dairy Farm App will assist dairy farm managers in making informed management decisions about their pasture and feeding in an effective and efficient manner. This will lead to increased production and utilisation of the feed base. In temperate regions of Australasia, feed-base management is the key determinant of the economic viability of dairy enterprises.
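
A toy example of the kind of pasture calculation such an app could surface is sketched below in Python: it estimates available feed from paddock pasture covers and compares it with herd demand. All figures, paddock names and field names are illustrative assumptions, not Sense-T data or the app's actual model.

    # Minimal pasture supply vs. herd demand sketch (all numbers are assumptions).
    paddocks = [
        {"name": "P1", "area_ha": 2.5, "cover_kgDM_ha": 2600},
        {"name": "P2", "area_ha": 3.0, "cover_kgDM_ha": 2200},
    ]
    RESIDUAL_KG_DM_HA = 1500      # assumed post-grazing residual target
    HERD_SIZE = 300
    INTAKE_KG_DM_COW_DAY = 18.0   # assumed daily intake per cow

    available = sum(p["area_ha"] * (p["cover_kgDM_ha"] - RESIDUAL_KG_DM_HA)
                    for p in paddocks)
    demand = HERD_SIZE * INTAKE_KG_DM_COW_DAY

    print(f"Available pasture: {available:.0f} kg DM")
    print(f"Daily herd demand: {demand:.0f} kg DM")
    print("Surplus" if available >= demand else "Deficit - supplementary feed needed")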

Fig. 3 Dairy Mobile App



Fig. 4 Dairy AR Glasses App



(3) Viticulture Project
Situational awareness mobile app (Alpha Trial). The Alpha Trial is the first full-scale trial of how all of the pieces of Sense-T fit together to deliver benefits for farmers. It will provide access to real-time data to help users make faster, more accurate business decisions. The Alpha Trial will take place at 20 sites across Tasmania, incorporating a range of agricultural businesses including vineyards, broadacre pasture, dairy, annuals, biennials, berries, floriculture and organics.



Fig. 5 Viticulture Mobile App



Fig. 6 Viticulture AR Glasses App



(4) Smart Greenhouse Project
A greenhouse is a building or complex in which plants are grown; it allows greater control over the plants' growing environment. Depending upon the technical specification of a greenhouse, key factors that may be controlled include temperature, levels of light and shade, irrigation, fertilizer application, and atmospheric humidity.

Intelligent systems in agriculture have become increasingly important in recent years. Lighting, temperature and energy management, control and regulation make greenhouses more efficient. This project will build an intelligent management system that allows the user to remotely control and monitor the greenhouse, even when they are not on site. The system should provide flexible functionality so that it can be extended with additional features.
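
To illustrate the style of control logic such a management system might run, here is a minimal Python sketch that compares sensor readings against target ranges and decides which actuators to switch. The target ranges, actuator names and readings are assumptions for illustration; the real system would also expose these decisions to a remote user interface.

    from dataclasses import dataclass

    @dataclass
    class Targets:
        temp_c: tuple = (18.0, 26.0)        # assumed acceptable temperature range
        humidity_pct: tuple = (60.0, 80.0)  # assumed acceptable relative humidity range
        soil_moisture: float = 0.35         # assumed minimum volumetric soil moisture

    def control_step(temp_c, humidity_pct, soil_moisture, targets=Targets()):
        """Return a dict of actuator commands for one control cycle."""
        return {
            "heater": temp_c < targets.temp_c[0],
            "vents_open": temp_c > targets.temp_c[1] or humidity_pct > targets.humidity_pct[1],
            "fogger": humidity_pct < targets.humidity_pct[0],
            "irrigation": soil_moisture < targets.soil_moisture,
        }

    if __name__ == "__main__":
        # Example readings; in practice these would come from greenhouse sensors.
        print(control_step(temp_c=27.5, humidity_pct=55.0, soil_moisture=0.30))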

The management system is shown below:

Fig. 7 Smart Greenhouse




(5) AR Indoor Positioning System
HITLab AU will develop an AR location-based system for farm managers. With such a system, farm managers will be able not only to access their farm information based on GPS, but also to learn how consumers engage with their products using an indoor positioning system (such as iBeacon). Two front-end systems will be built, one for consumers and one for farm managers. A storyboard sketch is shown in the figure below.

Fig. 8 Mobile client concept design


The proposed prototype system consists of a number of iBeacon base stations, a mobile application and an associated cloud-based server. Shown in Fig. 9 is an overview of the proposed system.

Fig. 9 System framework
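
One common building block for iBeacon-style indoor positioning is estimating the distance to each base station from its received signal strength (RSSI) using a log-distance path-loss model. The Python sketch below shows that calculation only; the calibration constants are assumptions, and the full prototype would combine readings from several beacons to estimate a position.

    import math

    def estimate_distance(rssi_dbm, tx_power_dbm=-59.0, path_loss_exponent=2.0):
        """
        Estimate distance (metres) to a beacon from RSSI using a log-distance
        path-loss model: RSSI = txPower - 10 * n * log10(d).
        tx_power_dbm is the calibrated RSSI at 1 m (assumed value here);
        path_loss_exponent depends on the indoor environment.
        """
        return 10 ** ((tx_power_dbm - rssi_dbm) / (10 * path_loss_exponent))

    if __name__ == "__main__":
        for rssi in (-59, -70, -80):
            print(f"RSSI {rssi} dBm -> ~{estimate_distance(rssi):.1f} m")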




(6) DroneAR: Augmented Reality Supported Unmanned Aerial Vehicle (UAV) in Agriculture for Farmer Perspective
DroneAR explores the farmer's perspective in agriculture through augmented reality supported unmanned aerial vehicles (UAVs). Although UAVs are already used in the agricultural sector in many ways, including spraying, seeding, remote sensing, precision agriculture, frost mitigation and variable-rate dispersal, adding AR technology aims to further reduce farmer inputs and increase yields. Our goal is to increase farm efficiency.

The proposed application is designed to instantly provide an eye-in-the-sky view of crops of any type and farms of any size, and to monitor farm infrastructure, livestock feedlots and pastures. The high-definition video camera transmits a live stream back to the base screen. A flight plan can be set either to scan an entire farm or to follow specific waypoints. The drone will land automatically when low battery power is detected, and a return-to-home function can recall it at any time. Using the Google Maps API, points of interest (POIs) are obtained with their latitude and longitude coordinates. DroneAR then creates labels, scaled so that closer POIs have larger fonts, and displays them on the live video streamed from the UAV.
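
The distance-scaled labelling described above can be sketched as follows. This Python fragment computes the great-circle distance from the drone to each POI and derives a font size that shrinks with distance; the POI coordinates and scaling constants are illustrative assumptions, not DroneAR's actual parameters.

    import math

    def haversine_m(lat1, lon1, lat2, lon2):
        """Great-circle distance in metres between two lat/lon points."""
        r = 6_371_000.0
        p1, p2 = math.radians(lat1), math.radians(lat2)
        dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
        a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
        return 2 * r * math.asin(math.sqrt(a))

    def label_font_size(distance_m, max_pt=36, min_pt=10, full_size_within_m=50.0):
        """Closer POIs get larger labels; distant POIs shrink towards min_pt."""
        scale = full_size_within_m / max(distance_m, full_size_within_m)
        return max(min_pt, max_pt * scale)

    if __name__ == "__main__":
        drone = (-41.4391, 147.1347)   # assumed drone position
        pois = {"Water trough": (-41.4400, 147.1360), "Feedlot": (-41.4450, 147.1500)}
        for name, (lat, lon) in pois.items():
            d = haversine_m(*drone, lat, lon)
            print(f"{name}: {d:.0f} m away, font {label_font_size(d):.0f} pt")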

Fig. 10 DroneAR

Eco and Cultural Tourism

(1) TAS MOVE: The Processes of Applying Flat Design in an Efficiency-Required Mobile Application
TAS MOVE is a mobile application designed specifically for tourists when they are on the move, focusing on their needs while travelling. It aims to help visitors accurately, quickly, and easily collect the information they need to assist with destination planning and decision-making.


This application provides a user-friendly interface that allows visitors to easily filter and find the information they need. Unlike some existing applications, such as Google Maps, TasMove not only provides a map function and geographical information, but also gathers and presents information about restaurants, coffee shops, historical buildings, attractions, farm-stay accommodation, museums, galleries, feature shops, market places, and so on. Visitors can modify their schedule at any time during their trip. TasMove utilises “flat design” and a colour palette that helps visitors recognise attractions and distinguish different situations easily and efficiently. The app also includes a taxi fare estimator on the information page: based on the distance between the visitor’s current location and the destination, the system estimates the taxi fare. This streamlines the process for visitors who want to travel by taxi.
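
As a simple illustration of how such a fare estimator might work, the Python sketch below applies a flag-fall plus per-kilometre rate to the distance between the visitor and the destination. The rates are placeholder assumptions, not actual Tasmanian taxi tariffs.

    # Illustrative fare model only; real tariffs vary by time of day and operator.
    FLAG_FALL = 3.60        # assumed base charge in dollars
    RATE_PER_KM = 2.20      # assumed per-kilometre rate in dollars

    def estimate_taxi_fare(distance_km: float) -> float:
        return FLAG_FALL + RATE_PER_KM * distance_km

    if __name__ == "__main__":
        for km in (1.5, 8.0, 25.0):
            print(f"{km:>5.1f} km  ->  ${estimate_taxi_fare(km):.2f}")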

Creative Arts and Service Design

(1) Effects of curves on network visualization
Node-link diagrams have been used to visualize abstract graph data for the purposes of knowledge discovery and knowledge sharing. Although links are often drawn as straight lines, curves have also been used in drawing graphs. Despite the wide use of curves for various purposes, the extent to which they serve those purposes, and how effectively curved links help visualizations convey the embedded information to users, remains a topic that requires further exploration.

In an attempt to answer this question, we initiated a project that aims to systematically investigate the pros and cons of curves, and their impact on human graph comprehension in terms of task performance and task execution behavior.
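
As an example of the kind of curved links studied in this project, the Python sketch below bends a straight edge into a quadratic Bézier curve by offsetting its midpoint perpendicular to the line between the two nodes. The curvature value and sampling density are assumptions chosen for illustration.

    import math

    def quadratic_bezier_points(p0, p1, curvature=0.2, samples=20):
        """
        Return sample points of a quadratic Bézier curve between nodes p0 and p1.
        The control point is the edge midpoint offset perpendicular to the edge
        by `curvature` times the edge length (curvature=0 gives a straight line).
        """
        (x0, y0), (x1, y1) = p0, p1
        mx, my = (x0 + x1) / 2, (y0 + y1) / 2
        dx, dy = x1 - x0, y1 - y0
        length = math.hypot(dx, dy)
        # Unit normal to the edge, used to push the control point sideways.
        nx, ny = -dy / length, dx / length
        cx, cy = mx + curvature * length * nx, my + curvature * length * ny
        pts = []
        for i in range(samples + 1):
            t = i / samples
            x = (1 - t) ** 2 * x0 + 2 * (1 - t) * t * cx + t ** 2 * x1
            y = (1 - t) ** 2 * y0 + 2 * (1 - t) * t * cy + t ** 2 * y1
            pts.append((x, y))
        return pts

    if __name__ == "__main__":
        print(quadratic_bezier_points((0, 0), (100, 0))[:3])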




(2) Use of ambient and eco-visualization for promotion of environmental awareness
Eco-visualizations (EVs) are interactive devices or images created by media artists to reveal energy use, with the aim of promoting sustainable behaviours or fostering positive attitudes towards sustainable practices. Despite much effort in developing various kinds of EVs, their value has mainly been appreciated from an artistic perspective, and relatively little attention has been paid to investigating whether and how they fulfill their purposes.

The objectives of the project were to explore this space by 1) designing three different EVs, and 2) examining how these EVs influence human energy consumption behaviour.




(3) Making sense of biology data through artistic interactions
Larger and more complex data about people’s health and biology are being generated than ever before. Interactive visualization, which takes advantage of the powerful human visual perception system, combined with advances in IT and data science, offers significant opportunities to generate new knowledge and improve medical practice. However, although many innovative visualization techniques and appealing pictures have been proposed in the literature, only a few of them are used in practice. It is widely acknowledged that this is mainly due to the lack of mechanisms that motivate users to engage with the process of sense-making.

To improve the situation, this project draws on the latest research results from the fields of visual arts, visualization, health and education, and aims to design and evaluate artistic interaction methods. We define artistic interaction, for the first time, as a series of dynamic artistic visual representations of data for effectively communicating the transformation of data states. It is hypothesized that by introducing artistic interactions, users will be more engaged with the sense-making process, making the visualization more effective in conveying the embedded information to end users than traditional reality-based interactions.




(4) Visual Analytics for Massive Multivariate Networks
This project aims to create methods to visually analyse massive multivariate networks. The amount of network data available has exploded in recent years: software systems, social networks and biological systems have millions of nodes and billions of edges with multivariate attributes. Their size and complexity make these data sets hard to exploit, and more efficient ways to understand the data are needed. This project will design, implement and evaluate visualisation methods for massive multivariate network data sets. This research is expected to be used by Australian software development, biotechnology and security companies to exploit their data.


ICT for Rural and Regional Health

(1) An innovative tele-assistance system to support education in clinical procedures

Access to sound expertise and guidance to perform clinical procedures is often lacking, especially in more rural and isolated areas. In the absence of direct supervision, practitioners and students can feel underprepared and lack the skills and confidence to perform many procedures. This project aims to improve the situation by making remote expertise and guidance more accessible.

In this project, we will trial a cutting-edge technology, a “wearable tele-assistance system”, that allows novices to undertake procedures with real-time audio and visual guidance provided by a supervisor at a distant location. The technology enables the instructor’s “helping hands” to be visually projected onto the work site (a patient’s wound, for example), allowing the novice to shadow the instructor’s movements throughout the procedure (if necessary) or allowing the instructor to “step in” should an incorrect action be initiated.
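
A much-simplified sketch of the underlying video compositing is shown below, assuming OpenCV in Python and two pre-captured frames: the novice's view of the work site and the instructor's hands against a dark background. The thresholding approach, frame sizes and blending weight are assumptions; the actual system works on live streamed video with calibration and latency handling.

    import cv2
    import numpy as np

    def overlay_hands(local_view, remote_hands, threshold=40, alpha=0.8):
        """
        Composite the instructor's hands onto the novice's view. Pixels in
        `remote_hands` brighter than `threshold` (i.e. not the dark background)
        are blended over the corresponding pixels of `local_view`.
        """
        remote_hands = cv2.resize(remote_hands, (local_view.shape[1], local_view.shape[0]))
        gray = cv2.cvtColor(remote_hands, cv2.COLOR_BGR2GRAY)
        mask = gray > threshold
        blended = cv2.addWeighted(local_view, 1 - alpha, remote_hands, alpha, 0)
        out = local_view.copy()
        out[mask] = blended[mask]
        return out

    if __name__ == "__main__":
        # Synthetic frames for demonstration; real input would be camera streams.
        local = np.full((480, 640, 3), 200, dtype=np.uint8)
        hands = np.zeros((480, 640, 3), dtype=np.uint8)
        cv2.circle(hands, (320, 240), 60, (180, 150, 120), -1)
        cv2.imwrite("composited.png", overlay_hands(local, hands))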





(2) Helping Hands: an innovative tele-assistance system for clinical skill development

The safe and correct performance of clinical procedures is a critical component of the skill set required by health care professionals. The development of competence in this area requires practice and repetition to improve eye-hand coordination and dexterity. Students are taught clinical (practical) skills in a laboratory under direct supervision. This supervision is resource intensive and time-limited. A student may therefore feel underprepared and lack confidence when they are asked to undertake a procedure whilst on placement at a hospital (and on entering the workforce).

This project aims to improve this situation by making remote expertise and guidance more accessible through the application of innovative augmented reality technology and wearable devices. The key feature that makes this technology unique is that it supports unmediated remote gestures by augmenting the object of interest with “helping hands”. This makes users feel as if they were working side by side even though they are physically distributed, thus improving user experience and task performance.





(3) Evaluating Inhabiting Visual Augmentation in a Telerehabilitation Context

This project is a pilot study of a prototype telerehabilitation system (Ghostman). Ghostman is a visual augmentation system designed to allow a therapist and patient to inhabit each other’s viewpoint in an augmented real-world environment, so the therapist can deliver instruction and observe performance from the patient’s point of view. In the pilot study, we investigated the efficacy of Ghostman by using it to teach participants to use chopsticks. Participants were randomized to a single training session, receiving instruction either through Ghostman or face-to-face from the same skilled instructor. Learning was assessed by measuring retention of skills at 24 hours and 7 days post-instruction. As hypothesized, there were no differences in reduction of error or time to completion between participants using Ghostman and those receiving face-to-face instruction. These initial results are promising and warrant further investigation.


Prof. Tom Furness and Dr. Winyu Chinthammit received a UTAS internal exploration grant to develop and demonstrate a proof of concept in which a physical therapist or healthcare professional can inhabit the body of a patient remotely in order to deliver movement instructions and therapy to the patient.


Education Technology and Learning Science

(1) Using hand/finger gesture controlled technology to enhance the learning experience of chemistry students.

Developing a new application to build and interact with chemistry models will enhance the learning experience and enrich the learning resources available to students. LeapMotion, an emerging interactive technology that tracks where the user’s fingers are in front of a computer, will enable students to use natural gestures to interact with molecules. The integrated experience between the new technology and the teaching materials will enhance students’ visuospatial ability and thus improve overall learning outcomes. This project will also improve understanding of how to develop and implement a new technology within a curriculum, adding to the knowledge base so that other staff will be able to do this more efficiently. Future cohorts of students studying chemistry across the university will then use the new technology and the associated teaching materials.
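
A tiny Python sketch of the kind of mapping involved, from tracked hand position to rotation of a molecule model, is given below. The read_hand_frame() helper is a hypothetical stand-in for the sensor's SDK call, not the real LeapMotion API, and the scaling factor is an assumption.

    # Sketch of mapping hand motion to molecule rotation.
    # read_hand_frame() is a hypothetical placeholder for the SDK call that
    # returns the current palm position; it is NOT the real LeapMotion API.

    def read_hand_frame():
        # Placeholder: pretend the palm sits 30 mm right of and 20 mm above centre.
        return {"palm_x_mm": 30.0, "palm_y_mm": 20.0}

    def palm_to_rotation(frame, degrees_per_mm=0.5):
        """Map lateral/vertical palm offsets to yaw/pitch of the molecule model."""
        yaw = (frame["palm_x_mm"] * degrees_per_mm) % 360
        pitch = (-frame["palm_y_mm"] * degrees_per_mm) % 360
        return yaw, pitch

    if __name__ == "__main__":
        yaw, pitch = palm_to_rotation(read_hand_frame())
        print(f"Rotate molecule: yaw {yaw:.1f} deg, pitch {pitch:.1f} deg")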




(2) Super High Definition Visual Analytics System (TS-VAS system)



The TS-VAS system is an interactive visualisation system capable of rendering 3D content at a very high display resolution (4K). The system is equipped with state-of-the-art interactive devices, including a high-precision head tracking system, fine hand/finger tracking, RGB-D body gesture tracking, facial recognition, and speech recognition. With these interactive capabilities, the system will allow users to interact with ultra-high-resolution 3D content. In addition to its visualisation and high-resolution display capability, the system will be equipped with customised data analysis capabilities to suit different research purposes.

The TS-VAS system is a tool that can facilitate cross-disciplinary collaboration with other CIs in Arts, AMC and Health Sciences, as well as projects aligned with UTAS strategic developments such as the Asian Institute and the Sense-T program, and external organisations such as Tourism Northern Tasmania and Launceston City Council. Within the Internet of Things theme, the TS-VAS system is the central communication space to which other sites around Tasmania can be linked. This will make possible a variety of real-time visualisation applications, such as exploration of Sense-T and climate change data, natural disaster management, clinical emergency simulation, and immersive 3D digital art.




(3) Encouraging Understanding of Natural Resource Using Emergent Technology


The hypothesis investigated was that a tangible multi-touch table interface encourages understanding of natural resource issues through map-based constructivist learning tasks. The natural resource issue of preparing for bushfire was chosen to test this hypothesis, addressing key problems such as residents' low inclination to prepare for bushfire. The system design and content evolved from the participatory involvement of three bushfire community groups.




(4) 3D Mobile Interactions with Public Displays


The use and prevalence of public displays has grown over the past decade; however, achieving user engagement remains one of the main difficulties of public display interaction. Collaboration among viewers enables users to engage more with public displays, but creating an environment where collaboration can happen is difficult, for reasons including the openness of the environment, the different types of users around the display, and the tools users have available to interact with it. A user interface that addresses some of these problems can increase engagement with public displays by creating new ways to collaborate around them. In this work we evaluate a 3D mobile interaction technique that creates a collaborative ambience by letting users position 3D content outside the display using natural human skills. To better understand the relationship between giving each user a private, unique view of the content and increasing collaboration around public displays, we conducted a usability test: 40 participants, in groups of two, solved a real-world scenario using either the proposed 3D mobile interaction or a traditional public display. We present the results of this usability test.




(5) A Hands-On, Mobile Approach to Collaborative Exploration and Discussion of Virtual Museum Artifacts


Many of the objects and artefacts that make up museum collections suffer from problems of limited access, usually caused either by their fragility, the conditions they are displayed under, or their physical location. This is problematic in museum learning contexts, where these limits make touching, handling or passing a given object around during collaborative discussions impossible. Interacting with 3D, virtual representations of museum artefacts is a potential solution, but such experiences typically lack the participatory, tactile qualities that make artefacts so engaging.

In this work, we describe the design of a tablet-based interactive system for collaborative exploration and discussion of virtual museum artefacts. Our contextual evaluation of this system provides evidence that hands-on, reality-based interaction with a tablet interface offers a significantly more engaging way to collaboratively explore virtual content than more traditional, desktop-based interaction styles, and provides an experience much more akin to that of handling real, physical objects.





(6) Hardware and software optimization of a night fighting and situational awareness system


The project will identify potential solutions to hardware and software issues associated with a night fighting and situational awareness system being developed by Rockwell Collins Australia Pty Ltd (RCA). RCA had identified three issues of importance prior to the workshop: optimisation of hardware and software for real-time video display to the operator; optimisation of the graphical user interface; and optimisation of the hardware enclosure.




(7) Effective Use of Mobile, Wearable, and Ubiquitous Displays in Teaching and Learning Environments


Multiple-display technology has been successfully used in many different applications, including command and control, vehicular and CAD design, scientific visualisation, education and training, immersive applications, and public information displays. There are many reasons why their use is successful in these contexts. They provide increased situational awareness; collaborative decision-making capability; the capability to visualise and manage large amounts of data simultaneously; the ability to interact with data at varying levels of detail and on different (and even orthogonal) dimensions; and the capability for enhanced learning environments in classrooms. Yet their use in education for enhancing methods of teaching and learning is still in its infancy, and many challenges are yet to be overcome in these environments.

This project lies at the intersection of ICT and education, and combines multimodal learning and cognitive load theory to investigate techniques by which mobile, wearable, and ubiquitous display technologies can be successfully employed to enhance teaching and learning practices.



(8) Higher-order Learning Tools for Teaching and Learning


If universities are to survive, they must look to the quality and relevance of their teaching activities in ways that they never have before. Concept maps are a graphical tool for organizing and representing knowledge. Strengths of the concept mapping technique are that it requires students to organize knowledge in a new way, articulate relationships between concepts, and integrate new knowledge with previously learned knowledge. Compared with more traditional techniques such as reading, attending lectures, and note taking, concept mapping has been shown to have many benefits.

In this project, we investigate the benefits that integrating higher-order learning tools such as concept maps can provide to students for organizing their thoughts and reinforcing their learning in ICT units. Prototype systems are designed and created, and HCI studies are undertaken.
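
To make the idea concrete, a concept map can be represented as a small labelled graph: concepts as nodes and the linking phrases between them as labelled edges. The Python sketch below shows one such representation with an invented chemistry example; it is illustrative only, not the prototype's actual data model.

    from collections import defaultdict

    class ConceptMap:
        """Concepts as nodes, linking phrases as labelled directed edges."""

        def __init__(self):
            self.edges = defaultdict(list)  # concept -> [(linking phrase, concept)]

        def add_proposition(self, concept_a, linking_phrase, concept_b):
            self.edges[concept_a].append((linking_phrase, concept_b))

        def propositions(self):
            for a, links in self.edges.items():
                for phrase, b in links:
                    yield f"{a} --{phrase}--> {b}"

    if __name__ == "__main__":
        cmap = ConceptMap()
        cmap.add_proposition("Atoms", "bond to form", "Molecules")
        cmap.add_proposition("Molecules", "are represented by", "Structural formulas")
        for p in cmap.propositions():
            print(p)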



(9) Personalizing Location-Based Services through Neighbourhood
“Hyperlocal computing” is an emerging field of research that refers to location-based applications and services targeting very specific local geographical areas (e.g. neighbourhoods) and the people who live or spend a substantial amount of time within them. In contrast to traditional LBSes, which provide information and functionality to users based on their current and changing geographical location, users of hyperlocal applications are typically very familiar with their geographical surroundings, very familiar with at least some of the people in the area, spend a substantial amount of time within the area, and are strongly bound to it. Because of this specialised feature set, hyperlocal services can provide very high levels of personalisation and can reach people in a much more targeted manner than most traditional LBSes, by delivering content that is relevant not just to the individual, but to the individual in the specific geographical area in which he or she resides.

With a focus on the home neighbourhood, this project investigates an experimental platform that supports a variety of LBSes to understand how users define “neighbourhood” as a geographical construct for use by online LBSes.
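
One basic operation behind such a platform is deciding whether a user's current position falls inside the polygon they have drawn as their neighbourhood. The Python sketch below uses a standard ray-casting point-in-polygon test; the boundary coordinates and user position are invented for illustration.

    def point_in_polygon(lat, lon, polygon):
        """
        Ray-casting test: returns True if (lat, lon) lies inside `polygon`,
        given as a list of (lat, lon) vertices. Adequate for small,
        neighbourhood-scale areas where Earth curvature can be ignored.
        """
        inside = False
        j = len(polygon) - 1
        for i in range(len(polygon)):
            lat_i, lon_i = polygon[i]
            lat_j, lon_j = polygon[j]
            crosses = (lat_i > lat) != (lat_j > lat)
            if crosses and lon < (lon_j - lon_i) * (lat - lat_i) / (lat_j - lat_i) + lon_i:
                inside = not inside
            j = i
        return inside

    if __name__ == "__main__":
        # Invented neighbourhood boundary and user position.
        neighbourhood = [(-42.88, 147.32), (-42.88, 147.34),
                         (-42.90, 147.34), (-42.90, 147.32)]
        print(point_in_polygon(-42.89, 147.33, neighbourhood))  # expected: True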


iFiction

Dr. Angela Thomas from the Faculty of Education and Dr. Winyu Chinthammit were successful in their REGS 2011 grant application, titled "Augmented reality and new dimensions of experience with literature and multimodal authoring: mobile technology, new media and literary creativity in English teaching", codenamed "iFiction". The project will develop an indicative prototype story-authoring mobile application that uses augmented reality to enhance and transform children’s interactive, participatory and innovative experiences with literature.


Magic Map

Magic Map is a collaborative project between HITLabAU and CSIRO Tas ICT that aims to improve the way we consume environmental sensor data through the use of augmented reality. Using a physical map in conjunction with a webcam and an application developed at HITLabAU, the user can see a 3D overlay of data collected from CSIRO environmental sensors superimposed upon the map. The user can then modify the type of data displayed, view data from different time periods, or superimpose a 3D mountainous terrain overlay by manipulating physical dials placed around the map.
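
A tiny Python sketch of how a physical dial's rotation (as detected by the vision system) might be mapped onto the discrete choices it controls is given below. The get_marker_rotation() helper is a hypothetical stand-in for the marker-tracking library used with the webcam, and the option lists are examples only.

    # Map a dial's rotation angle (degrees) to a discrete option.
    # get_marker_rotation() is a hypothetical placeholder, not a real API call.

    DATA_LAYERS = ["Air temperature", "Soil moisture", "Rainfall", "Wind speed"]
    TIME_PERIODS = ["Past hour", "Past day", "Past week", "Past month"]

    def angle_to_option(angle_deg, options):
        """Divide the dial's 360 degrees of travel evenly between the options."""
        sector = 360.0 / len(options)
        return options[int((angle_deg % 360) // sector)]

    def get_marker_rotation(dial_name):
        # Placeholder readings for demonstration.
        return {"layer_dial": 95.0, "time_dial": 280.0}[dial_name]

    if __name__ == "__main__":
        print("Layer:", angle_to_option(get_marker_rotation("layer_dial"), DATA_LAYERS))
        print("Period:", angle_to_option(get_marker_rotation("time_dial"), TIME_PERIODS))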

Magic Map Tasmania is the first of a series of collaborative works between HITLabAU and CSIRO Tas ICT. Please visit the following URL for details: http://www.hitlab.utas.edu.au/wiki/MagicMap