All posts by Heather Catchpole

Fourth industrial revolution lifts social good

Featured image above: The fourth industrial revolution will bring 75 billion connected devices to the world by 2020. Credit: World Economic Forum / Pierre Abensur

The Fourth Industrial Revolution will arguably become the most disruptive and transformative shift in history, and it’s happening at a rapid pace. Experts from all over the world are discussing how technologies such as artificial intelligence, 3D printing, robotics and biotechnology will have a profound impact on nearly every industry – from manufacturing and retail to entertainment and healthcare.

But one of the biggest areas of transformation will happen within the social sector. Nonprofits, NGOs and education institutions have a tremendous opportunity to leverage new technologies to scale up their impact and ultimately achieve their critical missions.

The Fourth Industrial Revolution offers huge opportunities to transform social good organizations for the better. Here are five key ways nonprofits, NGOs and education institutions can benefit:

1. Connect to anyone, from anywhere, on any device

The digital era has allowed more people from more places around the globe to become connected. And for the first time, people in remote places have access to other people, resources and aid through connected devices. There’s a huge opportunity for nonprofits and education institutions to reach more people than ever before and connect them with their cause. Today, nonprofits and education organisations can connect with their donors, volunteers, students and constituents in real time from anywhere. At schools, for example, a student advisor can send a text message or push notification the minute they see a student falling behind. Nonprofits can instantly reach their community of donors and volunteers to help with urgent matters that may mean the difference between life and death.

2. Scale like never before

Because we’re more connected than ever before, social good organisations can also scale like never before. Historically, a lack of resources and funding has plagued the social sector, but technology can help small organisations make a big impact. Now, it doesn’t matter whether an organisation has 8 or 8,000 employees; the number of people that can be reached is limitless. Populations that were previously unreachable can now be tapped and connected with particular causes without drastically increasing overhead costs. Individuals with a passion who may have previously felt helpless will be able to start international movements with minimal resources.

3. Organise communities and engage more deeply

With the arrival of the fourth industrial revolution, organisations can also start to organise and understand these communities better than ever before, resulting in deeper engagement. A nonprofit, for example, can organise its community based on region, specific causes, engagement level and more, and communicate with these groups or individuals in a way that’s highly personalised. According to the recently released Connected Nonprofit Report, 65% of donors would give more money if they felt their nonprofits knew their personal preferences—and 75% of volunteers would give more time. With deeper engagement, these organisations will start to see increases in donations and volunteer time, which directly impacts their mission. Schools and education organisations, meanwhile, can build curricula and course tools around students’ specific learning styles and preferences, engaging them more deeply and improving their education experience.

4. Predict outcomes

Not only is everyone becoming connected, but everything is becoming connected. In fact, there are expected to be up to 75 billion connected devices in the world by 2020, generating trillions of interactions. Advances in artificial intelligence and deep learning are helping make sense of this massive amount of data to deliver actionable insights to businesses and organisations alike. Artificial intelligence could perhaps be the biggest disrupter of all. For the social sector, this means services can recognise patterns within a community or particular cause and predict future outcomes. For example, education institutions can recognise patterns within a student’s journey, so teachers and advisors can proactively reach out to students who may be in danger of failing or dropping out before it happens. A nonprofit focused on a humanitarian crisis could identify the specific location and number of refugees coming into different countries, and pre-emptively send the appropriate level of aid and supplies.

5. Measure impact

Today, 90% of donors think it’s important to understand how their money is impacting the organisations they support, but more than half of donors don’t know how their money is being used, according to the Connected Nonprofit Report. As we look toward the future, the measure of nonprofit success will not be the amount of dollars raised—it will be the impact made on the communities they serve. Historically, impact has not been quantifiable, but with advances in data and analytics, social good organisations can measure how they are performing. This will be crucial to maintain and attract donors and volunteers who help make these organisations possible.

Social good in the fourth industrial revolution

Technology can create, inform and drive global change. The social sector can use it to find and connect with more people who need their services, understand their communities on a deeper level, predict outcomes to make them better prepared and possibly prevent certain situations, and even measure the impact they’re making against their cause.

But it’s up to social good organisations to take advantage of these opportunities—and quickly.

– Rob Acker, CEO, Salesforce

This article on the fourth industrial revolution was first published by the World Economic Forum. Read the original article here.

Grapevine app helps growers detect stress

A new grapevine app that helps grapegrowers measure the water status of their vines is being trialled across Australia.

The portable viticultural tool has the potential to help grapegrowers make improved water management decisions for their vineyards.

Grapegrowers use a thermal camera attached to their smartphone to take images of the canopy of the grapevine. The image is analysed by the grapevine app, which calculates the vine water status.
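The article does not describe the app’s actual algorithm, but a common way to estimate plant water status from canopy temperature is the crop water stress index (CWSI), which compares the measured canopy temperature against wet (fully transpiring) and dry (non-transpiring) reference temperatures. The sketch below is illustrative only, not the app’s published method:

```python
def crop_water_stress_index(t_canopy, t_wet, t_dry):
    """Illustrative CWSI from canopy temperature and wet/dry references (°C).

    0 indicates a fully transpiring, unstressed vine; 1 indicates full stress.
    """
    if t_dry <= t_wet:
        raise ValueError("t_dry must exceed t_wet")
    return (t_canopy - t_wet) / (t_dry - t_wet)

# Example: a canopy at 30 °C between references of 26 °C (wet) and 36 °C (dry)
cwsi = crop_water_stress_index(30.0, 26.0, 36.0)
print(round(cwsi, 2))  # 0.4
```

A grower could compare such an index against a block-specific threshold to decide when to irrigate.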

The technology is being tested by 15 vineyards in South Australia, Victoria, New South Wales and Tasmania for the rest of the growing season.

The Wine Australia-funded project is being led by the South Australian Research and Development Institute (SARDI), a division of Primary Industries and Regions SA, in close collaboration with The University of New South Wales (UNSW).

“Water and associated pumping costs can be a significant component of the production costs for grapegrowers,” says Dr Kathy Ophel-Keller, Acting Executive Director of SARDI.

“Uncontrolled water stress has the potential to reduce the yield and quality of grapes and the resulting wine, which in turn reduces the return to growers.

“The management of vine water status is a key tool for grapegrowers to regulate yield and optimise fruit quality and style.

“This new app offers grapegrowers instant feedback on the water status of their vines, and provides them with the flexibility to assess multiple blocks or sections of blocks, and to make irrigation decisions in real time.”

Dr Liz Waters, General Manager of Research, Development and Extension at Wine Australia says, “Irrigating effectively and efficiently helps to optimise vineyard production to produce high-quality winegrapes for fine Australian wines.”

“Through many years of extensive research, methods have been developed to assess grapevine water status. This new app provides a portable solution to measure water status quickly and easily in the vineyard.

“The app allows growers to make informed irrigation decisions that support the production of high-quality fruit grown to specification.”

The 18-month project aimed to evaluate a range of smartphone-based sensing systems to develop a cheap, easy-to-use vine water status monitoring app to assist growers to manage irrigation.

Initial trial results found the thermal camera was the easiest to use and provided accurate information.

The grapevine app was developed by UNSW and the tool is now being tested by a variety of wineries, with their feedback helping to inform the further development of the innovative technology.

The aim is to release the final version of the grapevine app later in 2017.

This information was first shared by Wine Australia on 24 January 2017. Read the original article here.

Biosensors to shield against deadly epidemics

Featured image above: Macdonald (centre) with colleagues from the Programa de Estudio y Control de Enfermedades Tropicales (PECET) at the Universidad de Antioquia, Colombia

In April 2016, only two months after the World Health Organisation officially declared the Zika virus outbreak a Public Health Emergency of International Concern, a team of Australian experts in tropical medicine and mosquito-transmitted diseases travelled to Brazil and Colombia. 

Among the delegation, arranged by the Australian Trade and Investment Commission, was Associate Professor Joanne Macdonald from the University of the Sunshine Coast (USC) in Queensland. The molecular engineer, who also holds an appointment at Columbia University in New York City, has been developing point-of-care biosensors, similar to take-home pregnancy tests, to diagnose diseases. Importantly, these devices can rapidly detect the genomes of multiple diseases simultaneously, keeping diagnostic testing costs down in areas where many diseases co-occur.

With A$130,000 from the Bill and Melinda Gates Foundation, she and colleagues in Queensland have been working on a proof-of-concept to test mosquitoes for malaria, dengue and chikungunya. The test will also detect the bacterium Wolbachia. When introduced into Aedes aegypti mosquitoes, this potential control agent has been found to prevent viruses, including dengue and Zika, from being transmitted to people. 

Improving diagnosis during epidemics with biosensors


A/Prof Joanne Macdonald (far right) and colleagues observing vaccine and antidote production facilities at the Institute of Butantan, Sao Paulo (Credit: A/Prof Joanne Macdonald)

In Rio de Janeiro, Macdonald heard from local researchers how diagnostic testing labs were overwhelmed by the Zika virus epidemic. Clinics were only testing pregnant women, she was told, and results were taking up to two weeks to be returned. Furthermore, labs were having difficulty distinguishing between Zika and dengue, which are closely related, she says. 

In this environment, Macdonald’s biosensors could be a game-changer. Apart from reagent substances, which trigger chemical reactions that ‘amplify’ DNA to detectable levels, the tests only require the most basic of lab equipment: a heating block and a centrifuge (a motor-driven device that spins liquid samples at high speed). This means tests can be easily performed in a doctor’s clinic or hospital, with results returned inside an hour.

“The scientists in Colombia and Brazil wanted the technology right then and there because there was such a dire need with the Zika outbreak,” she says. 

Since the trip, Macdonald has begun working on a test to specifically detect the genetic signature of the Zika virus, eliminating the potential for inconclusive results. Having already developed tests to detect Ebola, Japanese encephalitis, West Nile virus, and Hendra virus, which has killed nearly 100 horses in Australia over the last 23 years, Macdonald is confident it’s within reach.   

In a world where deadly disease vectors are increasingly mobile thanks to global transportation networks, Macdonald’s biosensors could become an important line of defence for future epidemics.  

“If we can provide solutions that allow testing to be done at the point-of-care, rather than in a central lab, that would be a big help,” Macdonald says. 

Macdonald has founded a startup called BioCifer to hold the intellectual property rights and commercialise the various technologies, and is currently working with USC to access the relevant intellectual property. With keen investors already in place, she’s hopeful a diagnostic product – initially for use in veterinary clinics and for research-only purposes – could be just two years away.   

Rapid detection vital to saving lives

Reproducing the detection sensitivity of state-of-the-art labs in a cost-effective, portable device is the ultimate goal of Macdonald’s research, and though it may be a decade away, she is making headway. In December 2015, she and her then PhD student Jia Li reported a world-first milestone in the journal Lab on a Chip, published by the Royal Society of Chemistry. 

They had developed a handheld, pregnancy test-style biosensor that could detect up to seven different analytes, each representing a potential disease. What’s even more innovative is how the device notifies the end-user of the result: if DNA from a certain disease is detected, it lights up patterns of corresponding molecules or dots, like pixels on a computer screen.

Inspired by the seven segment displays on digital watches, the dots are arranged to resemble the numbers 0 through 9. It’s the first time a numeric display like this has ever been demonstrated on a paper-based biosensor, known as a lateral flow device, and amazingly, it requires no external power source.
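The molecular readout itself is not described in detail here, but the seven-segment encoding the display borrows is straightforward: each digit 0–9 corresponds to a unique pattern of seven lit segments (conventionally labelled a–g). A minimal sketch of that mapping, purely to illustrate the principle:

```python
# Standard seven-segment patterns: a=top, b=top-right, c=bottom-right,
# d=bottom, e=bottom-left, f=top-left, g=middle.
SEGMENTS = {
    0: "abcdef", 1: "bc",     2: "abdeg",  3: "abcdg", 4: "bcfg",
    5: "acdfg",  6: "acdefg", 7: "abc",    8: "abcdefg", 9: "abcdfg",
}

def digit_for(active_segments):
    """Return the digit whose segment pattern matches the lit dots."""
    for digit, pattern in SEGMENTS.items():
        if set(pattern) == set(active_segments):
            return digit
    raise ValueError("no matching digit")

print(digit_for("abcdg"))  # 3
```

On the biosensor, the “segments” are spots of pre-programmed molecules that light up when the target DNA is present, rather than electrically powered LEDs.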

The biosensor “is powered entirely by molecules,” says Macdonald. “We are borrowing from computing, but using molecules instead of computer bits.” 

Programmed molecules play strategy games and make autonomous decisions

In 2006, while at Columbia University full-time, Macdonald and her colleagues built a computer out of DNA molecules. They programmed the DNA, modifying it to respond to stimulus, in order to play the strategy game tic-tac-toe interactively against a human. 

In the future, programmed molecules could be used to develop biological machines that operate inside the body, releasing drugs or insulin autonomously, on demand – something her US-based colleagues are working toward. Macdonald is harnessing the capability of this technology to more rapidly detect deadly diseases. 

By embedding computing principles in molecules “we can decide whether they will turn on or off depending on the presence of other molecules around them,” she says. “So it’s like a chemical reaction based on logic, the molecules can make decisions on their own without any external inputs. And we pre-program them to do this.” This is how the dots in the biosensor know to light up. 


Macdonald inside a laboratory at the Instituto Colombiano de Medicina Tropical, Medellin, Colombia (Colombian Tropical Medicine Institute)(Credit: A/Prof Joanne Macdonald)

Catching the microbiology bug

A rare coxsackievirus infection in high school, which affected Macdonald’s heart muscles and prevented her from participating in sport, helped spur a lifelong fascination with disease. After she recovered, her interest blossomed at the University of Queensland. While there she majored in biochemistry and microbiology, and later completed a PhD investigating the West Nile virus under the supervision of immunoassay expert Professor Roy A. Hall, with whom she still collaborates.

Macdonald went on to spend 10 years at Columbia University, first in the lab of Professor Ian W. Lipkin, an epidemiologist who was the scientific adviser for the Hollywood blockbuster Contagion, and then working with two “humongous scientific minds” in Professors Donald W. Landry and Milan N. Stojanovic. Under their guidance she not only programmed DNA molecules to play tic-tac-toe, but also helped develop a drug that inactivates cocaine, which is now being trialled as a treatment for overdoses. 

Back in Australia since 2012 and focused primarily on rapid disease detection, Macdonald is thinking about the next big question as point-of-care and biosensor technologies advance: “Can we actually predict epidemics before they start?” 

In the future, she wants her biosensors to effectively act as shields, used pre-emptively by aid agencies and community members to screen their surroundings, including potential hosts of infectious diseases such as bats, monkeys and mosquitoes, before outbreaks occur. She hopes it might empower communities, enabling them to take precautions before they get sick, and ultimately save lives. 

– Myles Gough

This article on biosensors was first published by Australia Unlimited on 19 January 2017. Read the original article here.

Smart needle uses IOT in brain surgery

The smart needle was developed by researchers at the University of Adelaide in South Australia and uses a tiny camera to identify at-risk blood vessels.

The probe, which is the size of a human hair, uses an infrared light to look through the brain.

It then uses the Internet of Things to send the information to a computer in real-time and alerts doctors of any abnormalities.

The project was a collaboration with the University of Western Australia and Sir Charles Gairdner Hospital where a six-month pilot trial of the smart needle was run.

Research leader Professor Robert McLaughlin, Chair of the University of Adelaide’s Centre of Excellence for Nanoscale BioPhotonics, says researchers are also looking at other surgical applications for the device, including minimally invasive surgery.

He says surgeons previously relied on scans taken prior to surgery to avoid hitting blood vessels, but the smart needle is a more accurate method that highlights their locations in real time.

“There are about 256,000 cases of brain cancer a year and about 2.3 per cent of the time you can make a significant impact that could end in a stroke or death,” he says.

“This (smart needle) would help that … it works sort of like an ultrasound but with light instead.

“It also has smart software that takes the picture, analyses it and it can determine if what it is seeing is a blood vessel or tissue.”
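The article does not detail how the software distinguishes vessels from tissue. One plausible (hypothetical) approach, given the probe works “like an ultrasound but with light”: flowing blood makes successive optical readings fluctuate far more than static tissue does, so a discriminator can threshold the reading-to-reading variability. A sketch under that assumption:

```python
import statistics

# Hypothetical illustration only - not the actual analysis software.
def looks_like_vessel(samples, threshold=5.0):
    """Flag a measurement site as a likely blood vessel when the
    frame-to-frame variability of the reflected signal is high."""
    return statistics.stdev(samples) > threshold

tissue_signal = [100, 101, 99, 100, 102]   # stable reflectance
vessel_signal = [100, 90, 112, 85, 108]    # fluctuating reflectance

print(looks_like_vessel(tissue_signal))  # False
print(looks_like_vessel(vessel_signal))  # True
```

In a real device the decision would draw on far richer signal features, but the principle of classifying each site in real time and alerting the surgeon is the same.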


Professor Robert McLaughlin (right) with the smart needle.

Professor McLaughlin says the smart needle has potential to be used in other surgical procedures. 

The trial at the Sir Charles Gairdner Hospital involved 12 patients who were undergoing craniotomies.

The needle with a 200-micron wide camera was successfully able to identify blood vessels during the surgery.

Professor Christopher Lind, who led the trial, says having a needle that could see blood vessels as surgeons proceeded through the brain is a medical breakthrough.

“It will open the way for safer surgery, allowing us to do things we’ve not been able to do before,” he says.

The smart needle will be ready for formal clinical trials in 2018.

Professor McLaughlin says he hopes manufacturing of the smart needle will begin within five years.

The project was partially funded by the Australian Research Council, the National Health and Medical Research Council and the South Australian Government.

The Australian Government has committed $23 million until 2021 to encourage vital research discoveries through the Australian Research Council Centre of Excellence for Nanoscale BioPhotonics.

– Caleb Radford

This article was first published by The Lead on 20 January 2017. Read the original article here.

Business confidence surges in Australia

Small and medium business confidence is now sitting at its highest level since March 2010, following an eight point rise to +46 on a net basis, according to the latest Sensis Business Index (SBI) survey.

Sensis Chief Executive Officer, John Allan says, “Businesses closed out 2016 on a high, with confidence up seven points for the year and expectations that 2017 will be even stronger. While business owners again felt optimistic about their own specific business strengths, it was the improvement in the perceptions of the economy which really drove confidence higher this quarter.”

The Index, which reflects the views of 1,000 small and medium businesses (SMBs) from across Australia, revealed that more than four times as many SMBs are now feeling confident (61%) as are worried (15%).
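Assuming the SBI is a simple net balance, as the quoted figures suggest, the headline number is just the share of confident businesses minus the share of worried ones:

```python
def net_balance(percent_positive, percent_negative):
    """Net balance: share reporting confidence minus share reporting worry."""
    return percent_positive - percent_negative

# 61% confident vs 15% worried gives the survey's +46 headline reading
print(net_balance(61, 15))  # 46
```

The same arithmetic explains the state figures that follow: a reading of +54 means confident respondents outnumber worried ones by 54 percentage points.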

There were business confidence gains everywhere except for Tasmania – down 14 points to +38. NSW is again the most confident location on +54, while the Northern Territory is again on the bottom, despite a seven point rise to +16.

“The national improvement in confidence was primarily driven by businesses in the east coast states. The story in Western Australia and the Northern Territory was very different with businesses there still adjusting to the conditions in the resource sector, although the situation has improved somewhat this quarter,” says Allan.

https://www.youtube.com/watch?v=zMqFjrV3suQ

Perceptions of the current state of the economy have moved into positive territory, jumping 11 points to +3, which is the best result since December 2013.

“Just as the stock markets shrugged off global uncertainty to finish 2016 on a high, it was also the case for Australia’s small and medium businesses. Perceptions of the economy are now in positive territory for both the short and long term projections and this points to a strong year ahead,” says Allan.

The Federal Government’s approval rating did not move this quarter, remaining on +2, although it finished the year down five points compared to the same period in 2015.

“The number of businesses worried about excessive bureaucracy and red tape was up this survey and these are the key issues the Government needs to address if it is to win over more business owners this year,” says Allan.

Business confidence in the policies of state and territory governments was the same or better everywhere except for the ACT. Despite this, only the Tasmanian, NT and NSW Governments have a positive rating, with the Queensland and South Australian Governments still lagging well behind.

At an industry level the biggest improvement was in Finance and Insurance, up 31 points and now leading the other sectors thanks to strong sales results. The biggest concern was in Retail Trade, down 23 points due to poor sales, and cost and competitive pressures.

In the last survey of each year the Index also looks at expectations for the year ahead. Businesses expect all of the key indicators – sales, employment, wages, prices and profitability – to remain positive, with strong results anticipated for sales and profitability in particular.

“The capital expenditure result was up nine points this year and now sits in positive territory, with businesses also reporting easier access to finance this quarter. These conditions should help foster growth in jobs and the economy in 2017,” says Allan.

“The lower Australian dollar also appears to have had a positive impact on exports in 2016, with the number of businesses exporting goods or services rising four points to 15 percent.”

Business confidence among SMBs in the capital cities rose 14 points to +49, while regional business confidence fell two points to +41. The 16-point reversal means metropolitan SMBs are now more confident.

“Businesses in Sydney, Melbourne and Perth overtook their regional counterparts this quarter and are now more confident. Nationally, business owners in metropolitan areas are more optimistic about the economy, which is the key factor making them more confident than their regional counterparts,” says Allan.

Small and medium businesses comprise 99% of all businesses operating in Australia.

This article was first shared by Sensis on 18 January 2017. Read the original article here.

Parents to thwart deadly beriberi with fish sauce

A joint study by the South Australian Health and Medical Research Institute and the University of Adelaide found that introducing fish sauce fortified with thiamine to the Cambodian diet provided enough nutrition to prevent beriberi disease, a leading cause of infant death in the country.

The study involved a trial in Cambodia, led by the South Australian researchers, in which varying levels of thiamine (vitamin B1) were added to fish sauce products during the manufacturing process.


Breastfeeding mothers and children who ate the fish sauce were then tested to confirm adequate levels of thiamine were present in their blood to prevent the disease.

Beriberi is caused by thiamine deficiency and in infant cases can quickly progress from mild symptoms such as vomiting and diarrhoea to heart failure.

With the findings published in the Journal of Paediatrics, Principal Nutritionist and Affiliate Professor at SAHMRI Tim Green says the next step is to lobby for funds to expand the trial in a bid to convince the Cambodian government of the merits of thiamine fortification.

“We’ve done this relatively large randomised controlled trial, but we provided the fish sauce in this case,” he says.

“Our next step is to scale up – to get Cambodian government or Cambodian industry involved and show that it works with 100,000 or 200,000 people.

“And if we can show that works, we can provide evidence to the government and they can also mandate the addition of thiamine to fish sauce.”

While fish sauce has no nutritional advantage over other foods trialled in the study, it was selected because of its near ubiquitous use in Cambodian culture.


Fish sauce is produced in centralised locations, making it easier for government and industry to control, and is already fortified with iron

Fortification is used in many countries around the world, but to be effective it is important to select a foodstuff already consumed by the majority of the population.

“Fortification is used in a lot of different settings – we do it in Australia, for example fortifying wheat flour with folic acid, or salt with iodine,” Green says.

“However, the important thing to consider is what you fortify may differ from country to country depending on what the staple is.

“We found that fish sauce in South East Asia is a good vehicle because it’s so popular and so widely consumed.”

While the trial was focused on Cambodia, Green says a similar strategy could be adopted in other South East Asian countries affected by beriberi disease.

“Because beriberi isn’t always recognised and the onset from the initial symptoms – which can be quite mild – to death is so rapid, the best thing to do would be to prevent it in the first place,” Green says.

While the study focused on thiamine fortification, the identification of fish sauce as the food of choice for delivery could also be expanded to cover other nutritional deficiencies.

Green says his team has also considered the possibility of using fish sauce to deliver vitamin B2.

– Thomas Luke

This article was first shared by The Lead on 12 January 2017. Read the original article here.

BioClay to create healthier food futures

A University of Queensland (UQ) team has made a discovery called ‘BioClay’ that could help conquer the greatest threat to global food security – pests and diseases in plants.

Research leader Professor Neena Mitter says BioClay – an environmentally sustainable alternative to chemicals and pesticides – could be a game-changer for crop protection.

“In agriculture, the need for new control agents grows each year, driven by demand for greater production, the effects of climate change, community and regulatory demands, and toxicity and pesticide resistance,” she says.

“Our disruptive research involves a spray of nano-sized degradable clay used to release double-stranded RNA that protects plants from specific disease-causing pathogens.”

The research, by scientists from the Queensland Alliance for Agriculture and Food Innovation (QAAFI) and UQ’s Australian Institute for Bioengineering and Nanotechnology (AIBN), is published in Nature Plants.

Mitter says the technology reduces the use of pesticides without altering the genome of the plants.

“Once BioClay is applied, the plant ‘thinks’ it is being attacked by a disease or pest insect and responds by protecting itself from the targeted pest or disease.

“A single spray of BioClay protects the plant and then degrades, reducing the risk to the environment or human health.”

She says BioClay meets consumer demands for sustainable crop protection and residue-free produce.

“The cleaner approach will value-add to the food and agri-business industry, contributing to global food security and to a cleaner, greener image of Queensland.”

AIBN’s Professor Zhiping Xu says BioClay combines nanotechnology and biotechnology.

“It will produce huge benefits for agriculture in the next several decades, and the applications will expand into a much wider field of primary agricultural production,” Professor Xu says.

The project has been supported by a Queensland Government Accelerate Partnership grant and a partnership with Nufarm Limited.

The Queensland Alliance for Agriculture and Food Innovation is a UQ institute jointly supported by the Queensland Government.

This article was first published by the University of Queensland on 10 January 2017. Read the original article here.

CRC funding priorities: a welcome change

Minister Greg Hunt has signalled a potentially very important change to the Cooperative Research Centres Program. He wants the ability to call for, or prioritise, national interest themes in future CRC funding rounds – for both Cooperative Research Centres and CRC-Projects. The CRC Association fully supports the Minister’s move.

Priorities for CRC funding rounds are not new. A number of existing CRCs were established as a result of the “priority public good” stream under the previous Labor Government. Ministers have often signalled several priority areas at the commencement of the funding round.

However, sometimes the priorities given were simply too vague to garner a meaningful response – I well remember debates about what “social innovation” meant when it was given as a priority. Calls for CRCs out of sync with the normal competitive funding round have also occasionally caused some confusion.
 
Through his media release today, Minister Hunt is doing things a bit differently. Firstly, he is seeking the views of the community on what issues should be prioritised.

Secondly, he is clear that any prioritised areas will need to be competitive and assessed on their merits in line with the normal processes.

Thirdly, and very importantly, he has said that the CRC program is open to all sectors and any prioritised areas will be in the national interest.

He has even gone further and named some example areas that many people would perceive as excluded by the current guidelines. 

The fast turnaround for consultation will allow for the coming Round 19 of the program to be impacted by the change.

– Tony Peacock


Tony Peacock is the CEO of the CRC Association and founder of KnowHow.

This article on CRC funding was first shared by the CRC Association on 21 December 2016. Read the original article here.


Tackling autism diagnosis on a national level

Autism is a neurodevelopmental condition characterised by behavioural differences in children, but autism diagnosis is far from straightforward.

Now, the Cooperative Research Centre for Autism Diagnosis (Autism CRC) and the National Disability Insurance Agency (NDIA) have joined forces to implement a national guideline for diagnosing Autism Spectrum Disorder.

The system will improve the highly variable and often delayed diagnoses currently delivered across different state health systems.

This initiative comes at a time when authorities such as the Australian Medical Association (AMA) have recognised autism diagnosis in Australia as an issue in urgent need of attention. Earlier this month, the AMA announced that the speed of diagnosis is of primary concern. 

Over the course of the next year, Professor Andrew Whitehouse, Director of the Autism Research Team at the Telethon Kids Institute, will spearhead collaborative research efforts to establish a national guideline to be published by September 2017.

One of the primary aims of the guideline is to streamline the diagnostic process across Australia and thereby accelerate vital, early-stage diagnoses.

Tackling variability in autism diagnosis

In developing the new guideline, the Autism CRC and NDIA hope to address problems that are rooted as much in the state-run approach to the diagnostic process as they are in the nature of autism itself.

“We don’t know enough about the genetics and neuroscience of autism, so we diagnose based on behaviour,” says Whitehouse. “And the way we appraise the particular behaviours differs quite considerably across states.”

According to Whitehouse, some states may require only one medical health professional to carry out a diagnostic assessment, while others mandate that every patient be consulted by a series of interdisciplinary teams. The level of diagnostic training and tools of assessment also vary greatly across regions, and between rural and metropolitan areas.

These factors impact not only the diagnostic outcome, but also the cost and time involved in reaching a conclusion.

“The variability in how we appraise behaviour associated with autism in Australia has a major effect on the cost of an assessment and the waitlist involved,” says Whitehouse.

A recent study suggested that autism diagnosis in Australia occurs around three to four years later than recommended, even though early treatment is key to limiting the effects autism has on an individual’s life.

Given the lack of a standardised, transparent approach to autism diagnosis across Australia, Whitehouse believes some families feel like they have to seek out multiple opinions. Not only does that delay the diagnosis, but it also adds to the emotional and financial strain for families, says Whitehouse.

“In the end, a delay is a cost to the family, as well as the Commonwealth government.”

Working with families for families

Over the course of the next year, the research team plans to work with families, individuals on the spectrum, autism experts, doctors, and service providers to make sure that the national guideline addresses the key issues faced by families and individuals on the autism spectrum today.

Their goal is to create an environment where families and individuals on the autism spectrum of all ages feel that they can trust in the process and can expect equal procedures across the whole of Australia.

“The main focus is not just rigour, but what is feasible to administer on the ground and what is acceptable to families,” says Whitehouse.

Along with the publication itself, plans for distributing the national guideline include extensive training of doctors and medical staff, as well as awareness campaigns for families.

Accelerated access to treatment

The Autism CRC and NDIA hope that a national approach to tackling autism diagnosis will lead to a smoother and more efficient diagnostic process, accelerating access to treatment and effecting more equitable outcomes for everyone living with autism.

“The national guideline is an important way to get all children with autism off to the best start in life, so that every child is afforded equal opportunities,” says Whitehouse.

A successful implementation of the guidelines could also set an example for agencies handling other disabilities.

“With this project, we hope to demonstrate that nationally harmonised protocols in the area of childhood disability are possible, particularly through collaboration with Government agencies,” says Whitehouse.

– Iliana Grosse-Buening

Autism CRC aims to provide the national capacity to develop and deliver evidence-based outcomes through its unique collaboration with the autism community, research organisations, industry and government. Find out more here.

You might also enjoy:

Biobank speeds autism diagnosis

Coastal flooding tool aids communities at risk

Coastal Risk Vanuatu is an open access website created to give individuals, residential groups, and local and national governments awareness and knowledge of how coastal communities in Vanuatu will be affected by sea level rise and coastal flooding.

Developed by NGIS Australia and the CRC for Spatial Information (CRCSI), the website is meant to empower people living on the coast to take proactive steps to act on sea level rise.

“The Coastal Risk Vanuatu website will build awareness regarding the challenges that Vanuatu faces with climate change, and will ultimately lead to more effective decision making”, says Director General of Climate Change Vanuatu, Jesse Benjamin.

Coastal Risk Vanuatu is a new initiative that builds on the work of the Pacific Island Coastal Inundation Capacity Building project and the Vanuatu Globe – previous research conducted by NGIS Australia and CRCSI in 2014.

This project, funded by the Australian Government, provided hands-on knowledge about mapping the coastline. It delivered coastal mapping and risk assessment capacity building and training to 190 people in four Pacific nations.

Coastal Risk Vanuatu is an open, interactive sea level rise platform based on the Vanuatu digital elevation model. It incorporates social media photos and Pacific Community UAV imagery captured during the first-response recovery after Cyclone Pam in 2015, demonstrating the value of imagery in disaster recovery.
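Viewers like this are typically built on a simple “bathtub” calculation: flag every elevation-model cell that sits at or below a projected sea level. The sketch below is purely illustrative – the grid values and function names are invented for the example, not taken from the CRCSI platform:

```python
# Hypothetical 5x5 digital elevation model, in metres above current sea level.
# (Invented values for illustration only -- not Vanuatu data.)
dem = [
    [0.2, 0.5, 1.1, 2.0, 3.5],
    [0.3, 0.8, 1.4, 2.2, 3.0],
    [0.6, 1.0, 1.9, 2.5, 3.8],
    [1.2, 1.6, 2.4, 3.1, 4.0],
    [2.0, 2.5, 3.0, 3.9, 4.5],
]

def inundated_fraction(dem, sea_level_rise_m):
    """Fraction of cells at or below the new sea level ('bathtub' flooding)."""
    cells = [height for row in dem for height in row]
    flooded = sum(1 for height in cells if height <= sea_level_rise_m)
    return flooded / len(cells)

for rise_m in (0.5, 1.0, 2.0):
    print(f"{rise_m} m rise floods {inundated_fraction(dem, rise_m):.0%} of cells")
```

Production tools refine this with hydrological connectivity (water must actually be able to reach a cell from the sea) and tide data, but the elevation threshold is the core of most sea level visualisations.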

“Building on the technical capabilities drawn from Australian research agencies, we now have the ability to accurately map coastlines to understand the impact of changing sea levels”, says Dr Nathan Quadros, Program Manager at CRCSI.

“Given our previous work in the Pacific Islands and the strong ties we have developed in the region, it is fitting that we extend our knowledge and expertise to vulnerable coastal communities, governments and NGOs,” says Quadros.

“Through this easy-to-use sea level rise visualisation tool Vanuatu will have access to the best information for their coastal adaptation planning”.

Insight into the impact of rising sea levels is expected to aid government and local agencies and guide stakeholders towards better policy decisions. It will also assist NGOs and emergency services to prepare for worst-case scenarios during coastal storms and flooding.

“With growing interest in the Pacific Region to be ‘climate ready’, we envisage further localised coastal risk websites being developed in the coming months”, says Quadros.

“We encourage you to explore the layers and coastal knowledge captured in this website and provide feedback to info@coastalrisk.com.au”.

– Jessica Purbrick-Herbst

This article on the coastal flooding webtool was first shared by the CRCSI on 14 December 2016. Read the original article here.

Snakebite deaths prevented by Aussie expertise

Featured image above: Myanmar Union Minister of Industry U Khin Maung Cho films the milking of tiger snake venom, used to make snakebite antivenom, at Venom Supplies in South Australia last month.

Australian snake and health experts are part way through a three-year project to protect Myanmar’s 55 million inhabitants from snakebites by boosting the quality and quantity of antivenom supplies, establishing distribution networks and educating residents and health workers on how to effectively treat and prevent bites.

Australia is home to the world’s deadliest snakes including the Inland Taipan, Eastern Brown, Belcher’s Sea Snake and Mainland Tiger Snake. This has led to Australia becoming a world leader in antivenom development and snakebite treatment and prevention strategies.

The Myanmar Snakebite Project began in late 2014 when the Australian Department of Foreign Affairs and Trade awarded the University of Adelaide $2.3 million for a three-year project, run as a partnership between the Australian Government and the Myanmar Ministries of Industry & Health.

Two years on, a visit to the Australian city of Adelaide by Myanmar Union Minister of Industry U Khin Maung Cho has helped boost the profile of the project and strengthen ties with South Australia.

The Minister used the visit in late November to learn more about the project and the world-class snake venom facilities in South Australia and also to find opportunities for further collaboration.

Royal Adelaide Hospital Renal Physician, Dr Chen Au Peh, is heading up the project with Women’s and Children’s Hospital Toxinologist, Professor Julian White, and University of Adelaide Senior Lecturer in Public Health, Dr Afzal Mahmood.

Snakes, primarily Russell’s vipers and cobras, bite thousands of people in Myanmar every year, leading to hundreds of deaths. They are a major concern in rice growing regions along the country’s biggest river, the Irrawaddy.

While not as deadly as some Australian snakes, Russell’s viper is a particularly dangerous snake because of the devastating impact its venom can have on the kidneys.

Up to 70% of acute kidney failure in Myanmar is due to snakebite, placing a major strain on the country’s underdeveloped health system.

Dominated by rice, agriculture is Myanmar’s major industry, accounting for about 40% of GDP and 60% of employment.

Rats and mice are attracted to the crops, which in turn attract the snakes.

Although the majority of snakebites occur in rural farming areas and many victims seek help from traditional healers rather than through the official health system, data has previously only been collected at major hospitals in Myanmar.

Professor White, one of Australia’s pre-eminent toxinologists, says simple steps such as encouraging farmers to wear boots and seek help quickly from health care workers rather than relying on traditional healers could make a significant difference.

He says the scope of the snakebite problem will become apparent as data is collected throughout the project beyond what has been captured at major hospitals.

“We don’t know the real figures for Myanmar yet, the official figure is 600 deaths and 13,000 cases per year but we think that figure will increase by a factor of between two and five once we’ve got more accurate data,” White says.


Dr Peh says the project is unique in its approach because it involves working closely with people in Myanmar at all levels to ensure the system being established is sustainable beyond the life of the project.

He says the holistic three-step approach includes increasing the quality and quantity of antivenom supplies produced in Myanmar, establishing reliable distribution networks and educating health workers and the general population about how to treat and prevent snakebites.

The project has so far focused on the region of Mandalay, the biggest rice growing region and one of the worst affected by snakebite.

Horses are used in Myanmar to produce the antibodies for Russell’s viper and cobra antivenoms, so Australian veterinary and horse husbandry experts, along with top-tier antivenom producer Seqirus, have been called on to provide advice.

Since the snakebite project started, horse mortality has been reduced by 90% while antivenom production has more than doubled to almost 100,000 vials a year.

Thirty solar-powered fridges have been purchased to store antivenom in remote areas in Myanmar and thousands of rural families have been educated about how to avoid being bitten and what to do if they are.

Dr Mahmood says South Australia was also well placed to share its health expertise with the nation beyond the snakebite project.

“We have been able to run refresher snakebite training for 200 doctors, we have run training for more than 200 primary health care workers, we have been able to reach 4500 families and provide them education in their homes, we have been to 150 villages and held community meetings,” he says.


Dr Mahmood, standing left, teaches doctors and nurses in Myanmar about snakebite prevention and treatment.

“We have the skill set and there is also a huge potential to collaborate on the health side of things with the development of hospitals, health services training, hospital maintenance and it goes on.”

White says the project is on track to have a significant impact by the time the current Australian Government funding runs out in 2018.

“By then the antivenom production will be meeting the entire national need, the distribution will be sorted out so it gets to where it is needed, and we will have assisted in teaching master trainers to provide sustainable ongoing training for all staff levels within their health system.

“Myanmar has a production capacity and, thanks to our input and their hard work, they have the potential to produce more antivenom than they need, so they could become an exporter in the region.”

“Snakebite is very much a major problem in the rural tropics – it’s not just isolated to Myanmar – if we cross over the border into India, we know there are upwards of 45,000 people dying every year from snakebite.

“We think there’s an opportunity here to make this a much bigger and much longer-term enterprise involving skills from Australia and skills developed in Myanmar and pushing them out more broadly to the region.”

– Andrew Spence

This article was first published by The Lead on 13 December 2016. Read the original story here.

Top stories of the year

Featured image above: AI progress makes history – #2 of the top stories in STEM from 2016.

1. New way to cut up DNA

On October 28, a team of Chinese scientists made history when they injected the first adult human with cells genetically modified via CRISPR, a low-cost DNA editing mechanism.

Part of a clinical trial to treat lung cancer, this application of CRISPR is expected to be the first of many in the global fight against disease.

2. AI reads scientific papers, distils findings, plays Go

Artificially intelligent systems soared to new heights in 2016, earning the field the number 2 spot on our list of top stories. A company called Iris created a new AI system able to read scientific papers, understand their core concepts and find other papers offering relevant information.

In the gaming arena, Google DeepMind’s AlphaGo program became the first AI system to beat world champion Lee Se-dol at the board game Go. Invented in China, Go is thought to be at least 2,500 years old. It offers so many potential moves that until this year, human intuition was able to prevail over the computing power of technology in calculating winning strategies.

3. Scientists find the missing link in evolution

For a long time, the mechanism by which organisms evolved from single cells to multicellular entities remained a mystery. This year, researchers pinpointed a molecule called GK-PID, which underwent a critical mutation some 800 million years ago.

With this single mutation, GK-PID gained the ability to string chromosomes together in a way that allowed cells to divide without becoming cancerous – a fundamental enabler for the evolution of all modern life. GK-PID remains vital to successful tissue growth in animals today. 

4. Data can be stored for 13.8 billion years

All technology is subject to degradation from environmental influences, including heat. This means that until recently, humans have been without any form of truly long-term data storage.  

Scientists from the University of Southampton made the top stories of 2016 when they developed a disc that can theoretically survive for longer than the universe has been in existence. Made of nano-structured glass, with the capacity to hold 360TB of data, and stable up to 1,000°C, the disc could survive for over 13.8 billion years. 

5. Mass coral bleaching of the Great Barrier Reef

The most severe bleaching ever recorded on the Great Barrier Reef occurred this year. Heavy loss of coral occurred across a 700km stretch of the northern reef, which had previously been the most pristine area of the 2300km world heritage site.

North of Port Douglas, an average of 67% of shallow-water corals became bleached in 2016. Scientists blame sea temperature rise, which was sharpest in the early months of the year, and which resulted in a devastating loss of algae that corals rely on for food. 

6. Climate protocol ratified – but Stephen Hawking warns it may be too late

On 4 November 2016, the Paris Agreement came into force. An international initiative to reduce greenhouse gas emissions and control climate change, the Paris Agreement required ratification by at least 55 countries representing 55% of global emissions in order to become operational.

So far 117 countries have joined the cause, with Australia among them. But some of the world’s greatest minds, including Stephen Hawking, believe time is running out if the human race is to preserve its planet. 

7. Young people kick some serious science goals

A group of high schoolers from Sydney Grammar succeeded in recreating a vital drug used to treat deadly parasites, for a fraction of the market price.

The drug, known as Daraprim, has been available for 63 years and is used in the treatment of malaria and of toxoplasmosis, a parasitic infection that is particularly dangerous for people with HIV. There was public outcry in September 2015 when Turing Pharmaceuticals raised the price of the drug from US$13.50 to US$750 per tablet.

In collaboration with the University of Sydney and the Open Source Malaria Consortium, a year 11 class at Sydney Grammar created the drug at a cost of only $2 per dose, and made their work freely available online.

8. Gravitational waves detected

Albert Einstein’s general theory of relativity was confirmed in February, when scientists observed gravitational waves, ripples in space and time, for the first time.

The detected waves were produced when two black holes merged into a single, much larger black hole. Gravitational waves carry important information about their origins, and about gravity, that helps physicists better understand the universe.

The gravitational waves were observed by the twin Laser Interferometer Gravitational-wave Observatory (LIGO) detectors in Louisiana and Washington. Australian scientists helped to build some of the instruments used in their detection.

9. Moving away from chemotherapy

Researchers at University College London made a leap forward in cancer treatment when they found a way to identify cancer markers present across all cells that have grown and mutated from a primary tumour. They also succeeded in identifying immune cells able to recognise these markers and destroy the cancerous cells.

This breakthrough opens the door not only for better immuno-oncology treatments to replace the toxic drugs involved in chemotherapy, but also for the development of personalised treatments that are more effective for each individual.

10. New prime number discovered

The seventh largest prime number ever found was discovered in November. Over 9.3 million digits long, the number 10223 × 2^31172165 + 1 was identified by researchers who borrowed the computing power of thousands of collaborators around the world to search through possibilities, via a platform called PrimeGrid.

This discovery also takes mathematicians one step closer to solving the Sierpinski problem, which asks for the smallest positive odd number ‘k’ such that k × 2^n + 1 is composite (non-prime) for every whole number n. After the discovery of the newest prime, only five candidate values of k remain.
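The record number has the form k × 2^n + 1 with k = 10223, and what made 10223 interesting is that no smaller exponent had ever produced a prime. A hedged sketch in Python (a generic probabilistic Miller–Rabin test, not PrimeGrid’s specialised software) illustrates the form and shows that every small exponent really does give a composite number:

```python
import random

def is_probable_prime(m: int, rounds: int = 20) -> bool:
    """Miller-Rabin probabilistic primality test (good enough for a sketch)."""
    if m < 2:
        return False
    for p in (2, 3, 5, 7, 11, 13):
        if m % p == 0:
            return m == p
    # Write m - 1 as d * 2**s with d odd.
    d, s = m - 1, 0
    while d % 2 == 0:
        d //= 2
        s += 1
    for _ in range(rounds):
        a = random.randrange(2, m - 1)
        x = pow(a, d, m)
        if x in (1, m - 1):
            continue
        for _ in range(s - 1):
            x = pow(x, 2, m)
            if x == m - 1:
                break
        else:
            return False  # 'a' is a witness that m is composite
    return True

k = 10223
# Every small exponent yields a composite number -- exactly why 10223
# survived so long as a candidate Sierpinski number.
print(all(not is_probable_prime(k * 2**n + 1) for n in range(1, 101)))
```

Testing the actual 9.3-million-digit value requires specialised software and serious computing power, which is what the PrimeGrid volunteers contributed.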

– Heather Catchpole & Elise Roberts

If you enjoyed this article on the top stories of the year, you might also enjoy:

Gravity waves hello

Have a story we missed? Contact us to let us know your picks of the top stories in STEM in 2016.

STEM work experience exciting the next generation

Featured image above: Nat Chapman recently welcomed a year 10 STEM work experience student, Isabella, to gemaker

Think back to your formative years. Was there an experience that inspired you to follow the career path you did? Or a person who made a difference in the choices you made?

If we truly want to attract the brightest minds to science and technology, STEM companies have a responsibility to inspire the next generation of innovators.

We have a responsibility to give opportunities to high school and university students in the form of STEM work experience and access to our staff.

And a responsibility to make those opportunities genuine, inspiring experiences – not just something to tick a box.

A week in the life of gemaker

When a work experience student came knocking on gemaker’s door, we had one warning for her – we don’t do boring.

Photocopying was off the cards.

Instead, she spent a busy week meeting researchers, assisting with events, attending client meetings and working on projects that gave her real insight into the world of research, commercialisation and start-up culture.

In a single week, gemaker’s work experience student:

  • attended the AGM of an ASX-listed mining company and spoke to shareholders and directors;
  • watched researchers training in how to pitch to industry;
  • toured a university robotics lab;
  • filmed scientists with a videographer;
  • visited a start-up technology company;
  • went to a business meeting with a potential client;
  • helped create an infographic explaining the commercialisation of research;
  • compiled survey data;
  • wrote an article on her experience for the gemaker website.

Through it all, the student was a delight to host.

She asked interesting and intelligent questions, and the enthusiasm she showed reminded us why we got into this business in the first place.

Yes, it can be challenging to design a program for a STEM work experience student.

Yes, it might be easier to point them at the lunchroom and the photocopier.

But if a small business like gemaker can do it, imagine the opportunities large, established companies and research organisations might be able to offer.

With a STEM work experience student, you win too

Taking on a work experience student can be exciting and have huge personal rewards for you too. A student can help you revitalise, recharge and remember what you love about your profession. It is inspiring to watch them be inspired.

Students can offer a different viewpoint, new ideas and a two-way learning opportunity that might surprise you. Why not ask a student how they think you could improve your social media presence?

Work experience is pivotal to the choices kids make in upper high school and beyond.

If we want to see more students in STEM, and believe passionately in the value of science and innovation, we have a social responsibility as a STEM organisation to provide genuine opportunities for students.

If we don’t make time for the next generation, we’re losing a massive opportunity to show what researchers can do.

Where to start

If you’re not sure how to go about inviting students into your workplace, here are three steps you can take this week:

  1. Tell staff that STEM work experience opportunities are available if they know students with a keen interest in science.
  2. See what STEM work experience programs are running at your own child’s school, and if you can contribute.
  3. Reach out to your local high school (start with the principal) to offer your services to the school.

You have the power within your hands to totally inspire a student or utterly turn them off.

At gemaker, we don’t have all the answers but we’re doing our bit.

And if each of us contributes, we can inspire the next generation and attract the brightest young minds to science and innovation.

– Natalie Chapman, gemaker


Connecting Women Leaders in STEM

Jo Stewart-Rattray heads ISACA’s Connecting Women Leaders in Technology program, dedicated to developing women leaders in STEM.

Deloitte Global projects that fewer than 25% of IT jobs in developed countries will be held by women at the close of 2016. My hope is that the women graduating into STEM careers this year quickly find employment in roles they can enjoy, learn and grow from, and become successful in their careers.

Of course, my wish is the same for the men who are also graduating at this exciting and disruptive time in business. However, a female student’s journey to graduation and beyond is very different to that of her male peers.

For example, a female student in STEM is often the only one in her class. I have sat in many boardrooms where I am the only woman in the room. I’ve also been the only woman at conferences on information security.

Over my 25 year career, not much has changed, and I know from speaking with other women leaders in STEM that they have had similar experiences. This is not just an Australian issue. It is a problem across the globe.

A study of 22,000 global public companies by the Peterson Institute for International Economics and EY shows that a company’s net profit margin can be increased by more than 6% if it has a minimum of 30% women in the C-suite.

Most importantly, without women in the workforce, we simply won’t have the resources to continue to fuel the job economy and innovation.

So what can be done to develop women leaders in STEM?

In my experience, a multi-faceted approach is needed. It involves businesses:

  • providing flexible work options;
  • connecting their employees with both male and female leaders in STEM for mentoring;
  • sponsoring and encouraging young professionals to understand their potential career paths and rewards; and
  • instilling in female students the confidence to follow their passion and be resilient.

In terms of mentoring, I learned early on to find men and women role models and mentors. I was able to do this through ISACA, a professional organisation for IT audit, risk, governance and cybersecurity professionals. My membership and involvement in ISACA enabled me to network with local and global peers, who really helped encourage and guide me in my career.

And now, I am incredibly humbled to spearhead ISACA’s Connecting Women Leaders in Technology program, which aims to inspire and engage women to grow and become leaders in our field.

It has been an enriching and rewarding experience to see young professionals excel by following their passion. 

So my message to future women leaders in STEM is to ‘Go for it!’ Have the resilience and confidence to seek the career you want, and find a mentor or bright star who can help guide you along the way.

Together, we will all prosper and learn from one another, as we innovate and create in the years to come.

Jo Stewart-Rattray


CISA, CISM, CGEIT, CRISC, FACS CP

Board director of ISACA

Director of information security and IT assurance at BRM Holdich, Australia

Hear from other Australian leaders on how to support women in STEM in the Women in STEM Thought Leadership Series:

Women in STEM: the revolution ahead

Women’s network supports health and medical researchers

Three years ago I did something pretty scary for a scientist with absolutely no business experience – I launched Franklin Women, a women’s network for professionals working in health and medical research.

The seed was planted on a flight from Sydney to Brisbane when I read a book I picked up at the newsagent called ‘Do cool sh*t’. It was an inspiring but quick read so I also had time to flick through a few pages of a Marie Claire magazine.

That month they had written on the value of professional networking and showcased a number of groups for women working in different sectors, from business to law. The idea of such a group immediately appealed to me.

I had just started a new job, in a new research area and in a new city, so I had a limited professional network. After that flight I started ‘googling’ for a group for women in health research careers that I could join. But … there weren’t any! So in a moment of craziness I decided that I would start one.

After six months of researching what the needs were in the sector and getting my head around all the business bits and pieces, Franklin Women was launched – a social enterprise aimed at bringing together women working across the health and medical research sector to create opportunities for networking as well as personal and professional development.

It turns out I wasn’t the only one who wanted to connect with other women in my field. Nearly 100 women turned up to our launch event ‘Let’s Meet’. Since then we have grown to over 400 professional members representing women in diverse organisations, roles and career levels within the health and medical research sector.

Over the last few years I have been lucky enough to meet many of these women and discuss why a group like Franklin Women is so valuable. The reasons that come up are varied and many, but the same three always stand out. 

What makes a women’s network so valuable?

Support and understanding 

As in other sectors, women are under-represented in leadership positions in the health and medical research sector due to a number of systemic and cultural barriers.

Meeting with a women’s network of like-minded peers who have experienced the same challenges as you can take away feelings of isolation. It also provides an opportunity to share ways to overcome any challenges as well as resources that are out there to support career progression.

But most importantly, you will always find a compassionate ear from someone who understands what you are going through. That in itself is invaluable.

Career connections outside your immediate circle 

Collaboration underpins successful research. However, there are limited opportunities to connect with professionals in different research groups of the same institution, let alone those in different organisations or even different roles.

One of the great things about Franklin Women is that we connect women who have a common passion of improving health but otherwise may never have connected.

At any one event we have university academics mixing with policy advisors, epidemiologists with lab scientists, and those working at hospitals with museum curators. The opportunities that come from these diverse connections are endless.

Learning new skills outside of the technical sciences

Researchers invest many years of study to become experts in their chosen technical area. With all that science to learn, there is little room left for training in the non-technical career skills that are just as important for career progression.

Like other professional networks, Franklin Women provides opportunities for learning broad professional skills, from networking and mentoring to using social media effectively. Not only can these skills be incorporated into academic careers, but they are also seen as transferable to roles outside of academia.

As we are finally entering an era where a successful career in science is moving past a single trajectory in academia, acquiring these skills is essential.

More opportunities for networking in the sciences are popping up around Australia so think about joining a women’s network. You never know what you may get out of it… a new collaboration, a new job opportunity or if nothing else just some good company! 

Dr Melina Georgousakis


Senior Research Fellow, National Centre for Immunisation Research and Surveillance, Sydney

Founder, Franklin Women

Find us on Twitter and Facebook.

If you enjoyed this piece on Franklin Women, the women’s network for health and medical researchers, you might also enjoy:

Connecting Women Leaders in STEM

Young innovators from Australia honoured in MIT awards

Featured image above: young innovators from Australasia. Top (L-R): Angela Wu, Dawn Tan, Wang Gang, John Ho and Prateek Saxena. Bottom (L-R): Simon Gross, Sumeet Walia, Yong Lin Kong, Zhi Weh Seh and Dhesi Raja. Credit: MIT Technology Review

EmTech Asia, in association with MIT Technology Review, today announced the region’s top 10 young innovators under the age of 35. The 10 ‘Innovators Under 35’ are honoured annually at MIT Technology Review’s EmTech Asia conference.

The list recognises the development of new technology and the creative application of existing technologies to solve global problems in industries such as biomedicine, computing, communications, energy, materials, web, and transportation.

EmTech Asia’s Disruptive Innovation Partner, SGInnovate, will host the ‘Innovators Under 35’ segment, where its Founding Chief Executive Officer, Steve Leonard, will present the young innovators with their award. 

“We want to encourage and support innovators who have the courage to embrace risk, and the vision to do important work on difficult challenges,” says Leonard.

“Our hope is these amazing young innovators will want to see their science and technology-based work increase its positive impact through active commercialisation efforts with teams such as SGInnovate.”

Now in its fourth edition, the award drew nominations from researchers, inventors and entrepreneurs across nine countries (Singapore, Malaysia, Thailand, the Philippines, Indonesia, Vietnam, Taiwan, Australia and New Zealand) for the 2017 list. This year’s 10 brilliant researchers and entrepreneurs come from Singapore, Malaysia and Australia.

Young innovators from Australasia

  1. Dawn Tan, 33, Assistant Professor, Engineering Product Development, Singapore University of Technology and Design (SUTD), SINGAPORE. Dawn receives the award for developing complementary metal-oxide-semiconductor (CMOS) nonlinear optical devices for unprecedented nonlinear photon efficiencies in multi-wavelength sources. Her research brings cheaper light sources to the chip, enabling 100X better bandwidth capacity in the transmission of data.
  2. Gang Wang, 34, Associate Professor, School of Electrical and Electronic Engineering & Associate Director of the ROSE Lab at Nanyang Technological University (NTU), SINGAPORE. Gang is recognised for his work in artificial intelligence and deep learning that will benefit industries such as mobile, virtual/augmented reality and self-driving cars. He founded Ultramind, which provides core artificial intelligence technologies including object detection, optical character recognition (OCR), and action recognition.
  3. John Ho, 27, Assistant Professor, Department of Electrical and Computer Engineering, National University of Singapore (NUS), SINGAPORE. John is awarded for his pioneering research on developing wireless technologies for bioelectronic systems that can be used to help treat intractable diseases like cancer and diabetes. By enabling smaller and deeper bioelectronic devices, these technologies could one day enable doctors to prescribe a tiny, wireless device instead of a pill.
  4. Prateek Saxena, 33, Dean’s Chair Assistant Professor, School of Computing, National University of Singapore (NUS), SINGAPORE. Prateek’s expertise is in cybersecurity. His work on symbolic tracing has been used to discover security flaws in Microsoft’s largest web product, and his work on auto-sanitization of web programs to make them robust against attacks has already been adopted in Google Chrome’s extension platform and Google’s web compilation infrastructure.
  5. Zhi Wei Seh, 30, Research Scientist, Institute of Materials Research and Engineering, Agency for Science, Technology and Research (A*STAR), SINGAPORE. Zhi Wei receives the award for designing advanced materials for clean energy storage and conversion. His pioneering sulfur-titanium dioxide yolk-shell structures for lithium-sulfur batteries have five times the energy density of today’s lithium-ion batteries.
  6. Yong Lin Kong, 29, Postdoctoral Associate, Massachusetts Institute of Technology (MIT), MALAYSIA. Yong Lin was nominated for his work on developing next-generation ingestible electronic devices that could improve the quality of life for patients with diseases that require long-term on-demand drug administration.
  7. Dhesi Raja, 32, Cofounder, Artificial Intelligence in Medical Epidemiology (AIME), MALAYSIA. Dhesi receives the award for his work in artificial intelligence in medicine. His AIME platform has the capability of identifying dengue and Zika outbreaks up to three months in advance.
  8. Angela Wu, 31, Founding Member and Scientific Advisor, Agenovir Corporation, AUSTRALIA. Angela was instrumental in launching Agenovir, which uses genome editing technologies to cure chronic viral infections. Using genome editing technologies to target destruction of viral DNA instead of human DNA, Agenovir’s future products will be able to remove these viruses from the cell, resulting in a permanent cure.
  9. Simon Gross, 33, ARC DECRA Research Fellow, Department of Physics and Astronomy, Macquarie University, AUSTRALIA. Simon is recognised for his work in integrated optics. He developed a fabrication process that gives integrated optics access to the third dimension, using a laser to sculpt optical circuits embedded in a block of glass, a process similar to 3D printing. The process is being used to develop the next generation of ultra-high bandwidth optical communication networks.
  10. Sumeet Walia, 28, Lecturer, Royal Melbourne Institute of Technology (RMIT), AUSTRALIA. Sumeet is noted for his work in nanoelectronics. He specialises in the use of metal oxides for the next-generation of high performance electronic devices and systems.

The 10 honourees will give elevator pitches about their work at EmTech Asia, which will be held from 14–15 February 2017 in Singapore.

The 10 also automatically qualify for consideration on the global MIT Technology Review magazine ’35 Innovators Under 35 List’. MIT Technology Review will showcase these 35 innovators in the September/October 2017 issue. 

This information was first shared by MIT Technology Review on 7th December 2016. View previous years’ Innovators under 35 here.

Blueprints to a collaboration boom

Featured image above: Robin Knight (right) and Patrick Speedie (left) are cofounders of university-industry collaboration platform IN-PART. Credit: IN-PART

Robin, you’re four years into the IN-PART journey, and you’re already connecting 70% of your university opportunities with potential partners. Can you take us back to the start, and tell us how you first came to be interested in university-industry collaboration?

Prior to setting up IN-PART I was in academic research at King’s College London. I was always interested in collaborating with industry partners, especially when working in an area with potentially translatable outputs.

While undertaking my PhD I started working on an academic-to-academic platform with a couple of colleagues, and during that time I had a conversation with my now co-founder and long-time friend, Patrick Speedie, who was working in IP management and publishing. Our shared experiences and discovery of the need to better connect the two worlds of academia and industry motivated us to form university-industry collaboration platform IN-PART.

Tell us a bit more about IN-PART and how it gained traction?

At its core, IN-PART is a tool to help Tech Transfer teams (and by extension researchers) find external partners interested in their research. The translation of academic research into impactful outputs is key to the advancement of society, and we wanted to be a key part in increasing those outputs.

So we began by building a network of individuals in industry who were both capable and motivated to interact with universities about research. Then we had to figure out the best and most efficient way to showcase opportunities to them.

After piloting a minimum viable version of IN-PART with six UK universities in 2013, we managed to match 25% of the opportunities provided with potential industry partners in just two months. Three years and two investment rounds later, we now match over 70% of each university’s content with potential partners.

IN-PART is all about university-industry collaboration. Why did you choose to focus on universities in particular?

We use the broader term of universities to represent publicly-funded research. Amongst these we also include research institutions, and notably we recently welcomed Public Health England to IN-PART. They are a very interesting case, as the outputs from a government lab differ from those of a traditional research institute, owing to the more hazardous bio-projects they undertake and the different potential technologies that result.

Our industry audience are often seeking to access the academic behind available IP, especially if considering a license. It’s rare that a company would be able to take a technology and have it fit directly into their research pipeline – expertise is required for guiding that fit and this makes universities and research institutions such an attractive resource.

An important element of what we do is making sure all the content we have is ‘available’. This means we do not ‘scrape’ websites for technology nor trawl the internet, which turns up expired patents and technology where the academic is no longer associated. Instead we keep in close communication with university teams to make sure everything we have is relevant and up to date.

We do not work with company or industry generated IP seeking licensees. We also never want to be in the industry of trading IP for the sake of litigation, which from my personal point of view seems to counter our progression as a species.

I’ve noticed that at IN-PART, you restrict your platform to particular industry professionals. Have you found this to be important to the success of your collaboration model?

Yes, very important. When we first piloted IN-PART in the UK under a beta-test with six universities, it was clear that we wanted to only provide introductions to end-users in industry. By restricting our audience in this manner it meant that every contact we passed along was meaningful and high-value. What we didn’t want to do was pass on opportunities to work with consultants. That being said, consultants provide a valuable component within the ecosystem and we’re currently exploring how they can be included within our community.

To hear more from Dr Robin Knight about the key drivers behind successful commercialisation and collaboration, click here.


Dr Robin Knight is Co-founder and Director of UK-based university-industry collaboration platform IN-PART.

Click here to find out more about opportunities with IN-PART. To find more industry-ready technology from Australian universities, visit Source IP.

Key drivers behind successful commercialisation

Featured image above: Robin’s team driving successful commercialisation and university-industry collaboration at IN-PART. Credit: Jennifer Wallis, Ministry of Startups

Robin, it’s great to have you with us to share your insights into successful research-industry partnerships. Let’s start with universities. In your experience, what factors make a university’s research most ripe for application by industry?

That’s a good question, and one that doesn’t have an easy answer! It’s entirely dependent upon the sector, the company, and what they’re seeking from a university. We’ve never pigeonholed ourselves as being a ‘commercialisation platform’ per se, as we believe that university-industry collaboration in all forms can lead to great outcomes.

Some of the best instances of successful commercialisation have occurred alongside goals for longer-term strategic partnership with a research program. End results in this instance include funding for studentships, secondments, and research commercialisation on a large scale. By virtue of this, the earlier relationships can be established the better.

I’m a complete believer in ‘research for research’s sake’, but for programs designed to have societal impact, the best way of achieving it is with a commercial partner in mind from the beginning.

What have you found universities who’ve achieved successful commercialisation do better than others?

University tech-transfer teams have numerous roles to fulfil, and one of those is to manage two often very different mindsets and expectations when it comes to their academics and potential partners in industry. Their role is a crucial one, and being a steadfast, efficient liaison is key. That means being responsive, knowledgeable and more often than not, flexible to both the needs of the academic and industry partner.

In the first instance people need to speak, and if there are prohibitive conditions and pensive overseers during initial dialogues, it can sully a relationship from the beginning, which at its core relies upon growing and nurturing trust between parties. That being said, it’s a tough line to walk, but the best are those most willing to participate in the first instance.

What factors have you found to be vital to both forming and maintaining successful collaborations between research and industry?

Technology transfer in the university sector benefits from great membership networks, with KCA in Australia, Praxis in the UK, ASTP-Proton in mainland Europe, and AUTM in the US. These networks promote best practice amongst the community, and it’s always great to hear people sharing experiences whilst networking.

Owing to this openness within the community there’s been a rapid evolution in adopting new tech-transfer techniques (that work). From our experience, it is the people most willing to engage with new initiatives and alter how they interact who do best. That means making the most of existing networks and proactively expanding them at conferences, on the phone, through LinkedIn, and of course, through IN-PART.

Additionally, feedback from industry tells us that university websites are labyrinthine, and the sites that work best do not showcase the internal complexities of organisations, but have key individuals for contact regarding broad academic sectors. These people provide triage on inbound inquiries, directing them through the most efficient channel; essentially taking the work off potential partners who might struggle to identify who it is they should speak with in the first instance.

To hear more from Dr Robin Knight about breaking down barriers to university-industry collaboration, and emerging trends in university-industry partnerships, click here.


Dr Robin Knight is Co-founder and Director of UK-based university-industry collaboration platform IN-PART.

Click here to find out more about opportunities with IN-PART. To find more industry-ready technology from Australian universities, visit Source IP.

Global collaboration and emerging trends

Featured image above: global collaboration. Credit Eric Fischer, Flickr

Robin, having been in this space for several years, can you tell us what is different about university-industry collaboration now, compared with 5 or 10 years ago? Have you noticed any trends emerging that we might see driving partnerships in the future?

We’ve been in the space for around four years, and in this short period of time we’ve seen a shift towards greater openness between universities and industry. Local governments, especially in countries where the knowledge-economy is becoming more important as manufacturing starts to wind down, have in part aided this change. Education throughout the industry community through shared membership bodies has also been key to improving relationships.

There’s a highly cited statistic from the UK government commissioned Dowling Review, that only 2% of small and medium-sized enterprises (SMEs) would think to consult their local university if they came upon a technological challenge. This is something that needs to change. It’s crucial that governments continue to engage in improving university-industry collaboration, bringing down financial barriers which hinder interactions for smaller companies. Grants for joint projects help do this, and private grant-writing companies within the space also play a role for companies wanting to access money but unsure how to go about it.

In the UK the Impact Agenda, which formed part of the government’s Research Excellence Framework (REF) for 2014, was met with much scepticism. Universities were required to submit case studies regarding the impact of their research on industry, governmental policy and the public directly. The level of funding for universities was affected by these case studies, which were each given a score. It meant quite a culture shift took place in UK universities, especially for academics whose funding is now directly linked (at least partially) to external engagement.

IP and ownership concerns are considered by many in Australia as one of the most difficult barriers to university-industry collaboration. How can organisations do better at addressing IP?

It’s good timing for this question, as recently our Head of Growth, Owen Nicholson, was part of the group developing the UK government’s Lambert Toolkit. It was launched last week and comprises a set of contracts for use by university and industry undergoing partnership discussions. The Lambert Toolkit contracts are not set in stone, but provide a great starting place and will certainly speed up that initial discussion when it comes to IP rights. I could see these types of blueprints being used globally. Owen’s insights on the Lambert Toolkit can be found here.

The valuation of early-stage research is, to my mind, an incredibly difficult process. In some sense, this does give a potential industry partner a better stake in negotiations, but they take on larger amounts of risk in doing so. With all things contractual, it’s about negotiation and making sure both parties are comfortable with the arrangement.

Can you share with us any insights into other major global collaboration barriers?

We’re currently working on removing some other barriers, one of which is how companies easily access worldwide university expertise. Currently all I can say is ‘watch this space’, but suffice to say we’re looking to further our vision of helping unlock university knowledge.

In your opinion, is there scope for better university-industry partnerships between Australia and the UK?

In our experience there should be no barriers to global collaboration and partnership, however some universities in certain locations have evolved research specialisms in line with their economy, providing cutting-edge developments within particular areas (e.g. renewable energy technology in coastal areas, or agricultural developments in areas surrounded by farmland).

Australia has a great diversity of research, developed by world-leading scientists, and our excitement at working with universities in the country is driven by our audience. Our industry users are forever keen for us to widen the breadth of technology and research available in new territories they’ve previously had little access to. For many in Europe and the U.S., especially SMEs, Australia represents such a territory.

To hear more from Dr Robin Knight about the blueprints to a global collaboration boom, click here.


Dr Robin Knight is Co-founder and Director of UK-based university-industry collaboration platform IN-PART.

Click here to find out more about global collaboration opportunities with IN-PART. To find more industry-ready technology from Australian universities, visit Source IP.

Why digital disruption will create your next career

Like many of you I am waiting for digital disruption to make my job redundant so I can lean out, reclaim my work-life balance and let the robots do the rest.

As a journalist, my first thought was to see how digital disruption could work for me, so I looked for an artificial intelligence that could write this article for me. It couldn’t, but it came scarily close.

While so-called artificially intelligent chatbots are at best frustrating, programs such as Wordsmith can take sets of data and generate various articles based on simple coding of parameters, while stuffing a few synonyms in to sound like a genuine journalist.
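The data-to-article pattern described above can be sketched in a few lines. This is purely an illustration of the general technique; the template, synonym lists and function names here are hypothetical, not Wordsmith’s actual interface:

```python
# A toy sketch of template-based article generation: structured data is
# slotted into a sentence template, with synonyms swapped in so that
# repeated articles don't all read identically.
import random

TEMPLATE = (
    "{team} {verb} {rival} {score} on {day}, "
    "{extending} their winning streak to {streak} games."
)

# A few interchangeable word choices per slot, to vary the prose.
SYNONYMS = {
    "verb": ["defeated", "beat", "overcame"],
    "extending": ["extending", "stretching"],
}

def generate_article(data, seed=None):
    """Fill the template from one row of match data, varying word choice."""
    rng = random.Random(seed)
    words = {slot: rng.choice(options) for slot, options in SYNONYMS.items()}
    return TEMPLATE.format(**data, **words)

match = {"team": "Sydney", "rival": "Melbourne", "score": "3-1",
         "day": "Saturday", "streak": 5}
print(generate_article(match, seed=42))
```

Real data-to-text systems layer conditional rules on top of this (e.g. a different template for a narrow win versus a blowout), but the core mechanism is the same slot-filling idea.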

Last week, an inaccurate post titled ‘The Trump Effect: It’s Happening Already!!’ went viral, and Facebook announced it would instigate third-party fact checking to crack down on fake news. Imagine a world in which AI could both check the accuracy of posts and generate endless streams of viral clickbait.

Need a meeting? Download an artificial assistant like Amy from x.ai to contact your clients directly and discuss suitable times. All you do is turn up.

Fancy a bite to eat? Before long autonomous vehicles will be at your beck and call to escort you to your favourite restaurant or deliver a much-loved takeaway.

Work in a construction trade or manufacturing? Robotics and 3D printing can download, print and stack your bricks, scaffolds and planking, twist your toothpaste caps on and sort quality from flawed product.

What about a highly-paid, precision career such as surgery? Google is already working with Johnson and Johnson’s medical device company Ethicon on the next generation of surgical robots – research that is based on Google’s work in autonomous cars.

Chances are, if you teach and/or work in academic research, you’ll already be aware of massive open online courses (MOOCs) and their potential to disrupt the way we learn and access our institutions. In four years, MOOCs have gone from zero to over 4,000 courses reaching around 35 million students.

Worried? You’re not alone: a PwC survey of 1300 CEOs globally found 62% were concerned about the impact of digital disruption on their industry. I recently heard a leader from the giant resources company BHP talking at the AFR Innovation Summit about being a recycler rather than a producer of steel after their disastrous 2015 downturn.

But if you think digital disruption means the robots are coming for your job, you’re wrong. While just under half of our jobs are expected to be at risk of automation in the next 10–15 years, for every disrupted career area, new opportunities arise: writing the software that generates news stories, humanising the language used by AIs, researching the signals that can make autonomous cars safer for pedestrians, or understanding the psychology behind creating incentives for innovation in your staff.

Where are we most at risk from missing the opportunities from digital disruption? Our team of thought leaders have the answers.

Heather Catchpole

Managing Director and Head of Content, Refraction Media

Read next: Head of KPMG Innovate, James Mabbott, uncovers the point of difference between those who remain resilient to change and those who get left behind.

Spread the word: Help Australia become a digital-savvy nation! Share this piece on digital disruptors using the social media buttons below.

More Thought Leaders: Click here to go back to the Thought Leadership Series homepage, or start reading the Women in STEM Thought Leadership Series here.

Disruptive technology is more than just apps

Businesses frequently take a relatively simple view of digital disruption. In fact, it’s often not the applications that are disruptive, but the technologies and networks that power them. Rather than focusing on building the next killer app, in seeking disruptive technology, scientists and business leaders should work together and invest in the underlying technologies that change the fundamental science of how their industries operate.

Digital disruption often occurs behind the scenes, improving or streamlining the processes which define how well (or how badly) businesses and industries perform.

Apps act as simply one channel for people – whether consumers or employees – to access this disruptive technology. An “app-centric” view of disruption risks overlooking more effective ways to not only digitally transform industry practices, but also make these transformations accessible to those whom they benefit.

IoT’s disruptive technology impact

Take the Internet of Things, for example. The natural resources sector has already begun to adopt sensors, data analytics, and automation across all manner of operations, from drilling to transport and even maintenance of mining infrastructure. This disruptive technology has even percolated into not apps, but caps.

Mining3, an industry consortium made up of the CSIRO, several universities, and major mining firms, has developed a cap which monitors truck drivers’ brainwaves to detect fatigue before its deadly consequences set in.
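Loosely, the idea behind this kind of wearable monitoring can be sketched as follows. This is a hypothetical illustration of the general approach, not Mining3’s actual algorithm, signal or thresholds: smooth a noisy alertness reading with a rolling average, and raise an alert when it stays below a safe level.

```python
# Hypothetical sketch of sensor-based fatigue detection: a rolling
# average smooths a noisy "alertness" signal from a wearable sensor;
# a smoothed reading below the threshold triggers an alert before
# fatigue becomes dangerous.
from collections import deque

def fatigue_alerts(readings, window=5, threshold=0.4):
    """Yield the index of each reading whose smoothed value dips below threshold."""
    recent = deque(maxlen=window)  # keeps only the last `window` readings
    for i, value in enumerate(readings):
        recent.append(value)
        if len(recent) == window and sum(recent) / window < threshold:
            yield i

# Simulated alertness readings (1.0 = fully alert, 0.0 = asleep).
signal = [0.9, 0.8, 0.85, 0.7, 0.5, 0.4, 0.35, 0.3, 0.3, 0.25]
print(list(fatigue_alerts(signal)))  # → [8, 9]
```

The smoothing matters: a single noisy dip doesn’t trigger a false alarm, but a sustained decline does, which is the behaviour you want before handing an alert to a truck driver.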

More and more, disruptive technology comes from partnerships just like Mining3, forged between researchers and businesspeople who both seek to challenge what the status quo can deliver.

Researchers possess unique knowledge and critical faculties for tackling major industry or socio-economic issues; businesses can provide the resources, both technological and monetary, to make solutions viable on a large scale. When both parties’ goals align well, these partnerships can ensure digital disruption goes beyond the relatively trivial domain of the next social media app to catch the consumers’ fancy.

Play to your strengths

To be effective, these disruptive partnerships must play to both researchers’ and businesses’ strengths. Watson is IBM’s cognitive computing platform and a product of a collaboration with Memorial Sloan Kettering Hospital. It can deliver surprising insights and strategic advice in almost any field – as long as it has enough data and human guidance to learn from.

When seeking to develop better treatments for cancer, Memorial Hospital provided both: thousands of hours of training from its doctors and research analysts, as well as more than 12 million pages of text from more than 290 medical journals.

The more IBM Watson learns from Memorial Hospital’s expert oncologists, the more effectively Watson can help doctors spot and treat cancers, disrupting traditional methods of diagnosis and care in a way that could save countless lives. Perhaps most importantly, however, these insights and capabilities are accessible to any doctor in any licensed hospital – via a simple-to-use iPad app.

As researchers and innovators, we should focus on technologies which disrupt the fundamentals of industry and society – and an app is just the tip of the iceberg in what’s possible in this Cognitive Era.

Dr Joanna Batstone

Chief Technology Officer, IBM Australia 

Vice President and Lab Director, IBM Research

Read next: Dr Joanna Batstone pinpoints what makes emerging technology so disruptive, and explains why we need to become more ambitious in our disruptive efforts. 

Spread the word: Help Australia become a digital-savvy nation! Share this piece on digital disruptors using the social media buttons below.

More Thought Leaders: Click here to go back to the Thought Leadership Series homepage, or start reading the Women in STEM Thought Leadership Series here.

Cognitive technology is the future, digital is simply a platform

Digital disruption is no longer confined to the online world – if indeed it ever was. We’ve already begun to see cognitive technology – technology able to perform what were traditionally human tasks – disrupt industries that we’ve previously considered as offline; from taxis to hotels and even door-to-door deliveries.

In order to innovate for tomorrow however, we need to stop thinking in terms of “online” and “offline”, because digital is simply a platform, and it’s “cognitive” that’s the future.

Living in the cognitive era

Throughout the age of digital disruption, we have seen industries underestimate the impact that technology can have on their operations.

Now, we find ourselves in the “cognitive era” – an age in which cognitive technology can understand, reason, learn and interact with natural language, and is very quickly bridging the human and machine divide in industries which never expected to be digitally disrupted. 

We are seeing augmented intelligence transform industries which have traditionally had a relatively low demand to “go digital”; industries such as healthcare, natural resources, and even fashion.

The thought of partnering AI technology with a creative industry like fashion seemed a little sci-fi just a few years ago, yet it is now on our doorstep.

Cognitive technology in healthcare

In healthcare, cognitive technology is already playing a key role in progressing the science of how we tackle the big health battles of today, such as cancer and chronic illness.

The number of Australians affected by cancer is expected to rise by almost 15% between now and 2020, and preventable chronic illnesses place a heavy burden on our health systems. It all comes down to early detection. Take skin cancers and melanomas for example; identifying the subtlest of changes in skin lesions as early as possible is key to a patient’s survival.

IBM Research is using image analytics and cognitive technology to help doctors identify these changes in dermatological images, and improve the rate of early detection.

The same logic applies to chronic diseases like diabetes and heart disease; the earlier we can identify at-risk patients and put them into preventative care programs, the better their quality of life; and we can also start to lessen the burden on health systems.

Disruption in creative industries

Beyond health, there are other industries ripe for disruption from cognitive technology. Governments and urban planners now count Internet of Things sensors and mobile devices amongst the tools for creating friendlier, smarter and in many ways, self-managing cities.

Even artists and designers have begun to incorporate data into their creative concepts, whether analysing past fashion trends or creating pieces that respond to digital feedback in real-time.

Embracing cognitive computing

The digital age is well and truly a given for all businesses and we must embrace this new era of cognitive computing. The emerging technologies on our doorstep – from the Internet of Things to cognitive technology to quantum computing – will make data even more powerful than it already is.

This means we need to become more ambitious in our disruptive efforts: rather than seeking to simply overturn the latest applications or digital platforms, we should focus on how to apply technology which can understand, reason, learn and interact with phenomena in the physical world, and vice versa.

Dr Joanna Batstone

Chief Technology Officer, IBM Australia 

Vice President and Lab Director, IBM Research

Read next: Joanna Batstone discusses how scientists and business leaders can work together in disruptive partnerships.

Spread the word: Help Australia become a digital-savvy nation! Share this piece on digital disruptors using the social media buttons below.

More Thought Leaders: Click here to go back to the Thought Leadership Series homepage, or start reading the Women in STEM Thought Leadership Series here.

Remaining relevant in the digital age

Novelist William Gibson is credited with saying “the future is already here – it’s just not very evenly distributed”. 

This in a nutshell epitomises the challenge for mature businesses and industries. 

Their possible futures are being played out by the emerging digital versions of their existing selves. Smaller, more nimble competitors built on the infrastructure of tomorrow’s enterprises are using new tools and methodologies to disrupt established players. And they are able to do so unencumbered by the legacy systems and processes of larger players.

For many in established businesses it is not a case of if but when in terms of the threat of digital disruption. But the phrase “digital disruption” hides a subtle nuance when discussing disruption in the context of business: disruption is actually a human story, not a tech one. 

Digital services and enterprises on their own do not disrupt established businesses. Rather, digital services, technologies and business models enable your customers to disrupt you.

Take, for example, the rise of marketplace-style businesses such as Uber and their impact on incumbent taxi services. The simple fact of Uber's existence did not in itself disrupt the taxi industry. But by offering the passenger a better customer experience, a more cost-effective service and greater ease of use, it enabled customer-led disruption.

If you were to look at the legacy business model for a taxi company in Australia, it focuses on the regulator, the operator and licence holder, and the driver – rarely does the passenger feature. Today, passengers can actively compare their taxi journey experience with that of the Uber model – and customers are voting with their digital wallets.

The key for incumbent large corporations to stay relevant is customer focus. This is not a new mantra: most of my working career has been spent in, or working with, organisations trying to achieve customer centricity. What has changed in the last 10–15 years is the realisation that terms such as "customer ownership" are by and large meaningless. Customers are not owned. They are earned and need to be maintained.

To do this requires an increasing emphasis on data to better understand customers and their needs. It means the use of customer journey mapping tools alongside this data to really explore the customer experience at every single touch point. It means the analysis of ethnographic studies to see how customers use products and services.

Most importantly, organisations need to bring the customer into every stage of the product development process. Old world, business case-driven product development processes need to be replaced with customer data and hypothesis-driven experiments. The product development process needs to include customer testing at every stage, from idea to prototype to final product. And this process needs to allow for customer feedback and for data to drive decision making and change along the journey.

Consumers’ experiences, and hence their expectations, are increasingly being shaped by the proximity, intimacy and aesthetic of their day-to-day interactions with a range of digitally delivered products and services. Whether it is the beautiful simplicity of the Google search bar, the elegance of Apple design or the magic of Disneyland, the benchmark for customer experience, attraction and retention is being set globally. As a result, the customer experience needs to be judged not just against best in class for a particular industry or product segment, but against best in class, full stop.

Market leaders today who survive well into the future will look across industries in their response to digital disruption and adapt and change to the new, unevenly distributed future.

James Mabbott

KPMG partner and Head of KPMG Innovate

Read next: PwC’s Technology Innovation Leader, Dr Crighton Nichols, describes the tools that allow forward-thinking organisations to learn faster than their competitors. 


New frontiers in digital disruption

The building blocks of digital technology consist of information theory (which codifies content into binary 0/1 format) and transistors (essentially on/off switches). Both were invented during the heyday of the American research and scientific development company Bell Labs in the decade following WWII. Subsequently, each new and improved wave of digitisation has caused upheaval as it visits particular markets and occupations. However, from the perspective of the whole production and consumption system, progress has been relatively slow and staggered compared to what we are likely to see in the future.

In the 1950s, computers at even the most advanced tech locations in the US filled two-storey buildings and performed only highly specialised and limited functions. It was not until the 1980s – when smaller mainframes became cheap and fast enough to replace routine operations – that digital technology effectively eliminated the labour market for clerical workers.

Automation, robots and digitally guided technologies started making inroads into manufacturing around this time. Although satellites have been used since the 1960s to provide market intelligence for producers (giving US farmers advice on what and how many crops their competitors were growing, for example), it took until the 2010s for satellite-aided location services to become ubiquitous and part of consumers’ daily lives.

More and more, digital disruption is being triggered by innovative software, such as travel search engines and language translation services, rather than hardware. Since software can be shifted into large-scale production much faster than hardware, this accelerates the pace of disruption.

One form of software that is playing an increasingly important role is a form of artificial intelligence called ‘machine learning’. Computers are governed by algorithms made up of many rules that dictate “if X, then do Y”. These rules are usually set by the programmers who wrote the algorithm code. But things are different in the case of machine learning algorithms. Such an algorithm can ‘learn’ from data by altering its own parameters, progressively improving its ability to determine patterns or predict future trends in the data (analogous to the way our brains learn from past experience).
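The difference between a hard-coded rule and a learned one can be sketched in a few lines of Python. This is a deliberately simplified toy, not any particular product’s algorithm: instead of a fixed threshold set by a programmer, the learning version nudges its one parameter whenever it makes a mistake on labelled examples.

```python
# A fixed rule is hard-coded by the programmer: "if x > 5, then flag it".
def fixed_rule(x):
    return x > 5

# A machine learning version instead learns the threshold from
# labelled examples by repeatedly adjusting its own parameter.
def learn_threshold(examples, passes=1000, lr=0.01):
    """examples: list of (value, is_flagged) pairs."""
    threshold = 0.0
    for _ in range(passes):
        for x, label in examples:
            prediction = x > threshold
            if prediction and not label:
                threshold += lr   # flagged too eagerly: raise the bar
            elif label and not prediction:
                threshold -= lr   # missed a case: lower the bar
    return threshold

data = [(1, False), (2, False), (3, False), (8, True), (9, True), (10, True)]
t = learn_threshold(data)
# t settles between the unflagged and flagged examples,
# so the learned rule classifies all of the training data correctly
```

The update rule here is a bare-bones cousin of the perceptron learning rule; real systems learn thousands or millions of parameters the same basic way.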

For example, machine learning algorithms have been used for the past two decades in spam filters. When we label emails as spam, we are generating a labelled dataset that can be used to train a machine learning algorithm to recognise the properties of emails that are usually associated with spam. The trained algorithm can then remove such emails automatically.
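A toy version of such a filter can be written in a few lines. The simple word counting below is an illustrative stand-in for the statistical models real spam filters use, but it shows the same idea: the program’s behaviour comes from the labelled data, not from hand-written rules.

```python
from collections import Counter

# Toy spam filter: learn word weights from emails the user has labelled.
def train(labelled_emails):
    """labelled_emails: list of (text, is_spam) pairs."""
    spam_words, ham_words = Counter(), Counter()
    for text, is_spam in labelled_emails:
        (spam_words if is_spam else ham_words).update(text.lower().split())
    return spam_words, ham_words

def looks_like_spam(text, spam_words, ham_words):
    # Score each word by how much more often it appeared in spam than ham.
    score = sum(spam_words[w] - ham_words[w] for w in text.lower().split())
    return score > 0

model = train([
    ("win a free prize now", True),
    ("free offer click now", True),
    ("meeting notes attached", False),
    ("see you at the meeting", False),
])
looks_like_spam("free prize inside", *model)        # → True
looks_like_spam("meeting agenda attached", *model)  # → False
```

Every time a user labels another email, the training set grows and the filter’s judgements improve, with no programmer rewriting the rules.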

Machine learning has even begun transforming the oldest of professions, such as medicine and the law, hitherto considered the preserve of nuanced interpretation and experiential knowhow. Law has long resisted automation from computers and digital analytics, in part because of the non-routine nature of contracts and litigation. However, this is now changing as machine learning methods have partially automated tasks by detecting patterns and inferring rules from data.

eDiscovery is one such digital tool, used to assist lawyers in searching through emails and piles of office documents to find the evidence needed to clinch a case (the proverbial needle in a haystack). Machine learning can disrupt the eDiscovery process by efficiently bringing together similar documents based on their contents and metadata. Brainspace provides lawyers with an eDiscovery tool that increases the efficiency and accuracy of finding information pertinent to a court case. Alternatively, ROSS, a machine learning law tool, can answer legal research questions posed in natural language and can monitor recent legal developments relevant to a particular case.
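To make the grouping idea concrete, here is a minimal sketch using Jaccard word overlap to cluster similar documents. This illustrates the general technique of content-based grouping, not how Brainspace or ROSS actually work; the threshold and the greedy grouping strategy are arbitrary choices for the example.

```python
# Group documents whose word overlap (Jaccard similarity) exceeds a
# threshold - a simple stand-in for the content-based clustering that
# eDiscovery tools perform at scale.
def jaccard(a, b):
    wa, wb = set(a.lower().split()), set(b.lower().split())
    return len(wa & wb) / len(wa | wb)

def group_similar(docs, threshold=0.5):
    groups = []
    for doc in docs:
        for group in groups:
            # Greedily join the first group whose representative is similar.
            if jaccard(doc, group[0]) >= threshold:
                group.append(doc)
                break
        else:
            groups.append([doc])
    return groups

docs = [
    "contract draft for acme merger",
    "revised contract draft for acme merger",
    "lunch menu for friday",
]
group_similar(docs)  # the two contract drafts group together; the menu stands alone
```

Production systems use richer representations (metadata, embeddings) and far better clustering algorithms, but the payoff is the same: a lawyer reviews one representative document per group instead of every document in the pile.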

In medicine, machine learning algorithms are increasingly being used to help perform radiological diagnoses. They can be trained to classify medical scans as normal or diseased, or to quantify the size of diseased areas. In the area of brain cancer, Microsoft’s InnerEye research project has been investigating the use of an image analysis tool to measure the size of brain tumours.

As these machine learning methods save lawyers’ and medicos’ time, we will see their labour productivity rise along with a major shift in the content of their work, and perhaps a reduction in the demand for lawyers and medicos. Handled sensibly by governments, this reduced demand will release workers for other occupations in, for example, the creative, scientific and caring industries.

Professor Beth Webster

Pro Vice-Chancellor of Swinburne University (Research Policy and Impact) and Director, Centre for Transformative Innovation

Co-authored by:

Dr Stephen Petrie, Data Scientist, Centre for Transformative Innovation 

Mitchell Adams, Research Centre Manager, Centre for Transformative Innovation

Read next: Dr Bronwyn Evans, CEO of Standards Australia, traces the rise of blockchain technology and defines the framework needed to build trust in blockchain systems. 
