How a stint in Silicon Valley unleashed one researcher’s business skills
Tomasz Głowacki’s career now straddles academia and industry, thanks to his participation in a leadership programme organized by the Polish government.
In 2007, when I started work as a research and teaching assistant at Poznań University of Technology in Poland (a job that straddled bioinformatics research and teaching discrete mathematics, algorithms and data structures), I thought academia would be a lifelong career. I enjoyed the intellectual freedom, the chance to work on challenging problems and the travel opportunities.
Shortly after defending my computer-science PhD thesis in 2013, I secured a place on the Polish government’s Top 500 Innovators initiative, a nine-week programme in research commercialization and management at universities with high positions in the Academic Ranking of World Universities. It was set up because the Polish government thought a lack of cooperation between researchers and business was one of the main reasons for the country’s low position in European Innovation Scoreboard rankings.
The focus at my interview was how to commercialize my research results. I was asked about factors such as potential customers, business models and pricing. Two months later, I was one of 500 scientists sent to one of three universities: the University of California, Berkeley; Stanford University in California; or the University of Cambridge, UK.
The goal was to learn from the very best researchers and business practitioners. While at the Walter A. Haas School of Business at Berkeley, I spent time with researchers, practitioners and entrepreneurs from Silicon Valley. What surprised me the most was the marriage between business and academic institutions in California.
Lecturers shared their experiences of research commercialization, business and start-up firms. This was very different from Poland, where academic career progression gives scientists little credit for commercial activities or for cooperation with business. In my experience, many Polish scientists see commercialization activities as a roadblock to their academic careers.
During the Berkeley training, I heard how PhD students can successfully transition into business. These lectures were delivered by Peter Fiske, who is now director of the Water Energy Resilience Research Institute at Berkeley Lab, and whose career straddles both industry and academia. Fiske focused on transferable skills between academia and business, covering data analysis, resourcefulness, technological awareness, resilience, project management, problem solving, English proficiency and good written communication. Fiske is a strong advocate of the need to market yourself as a scientist. Mark Rittenberg, a business and leadership communications specialist at Haas School of Business, taught us about the power of communication and storytelling. As scientists, we focus mostly on research results. We tend to think that the content we present is enough to sell ourselves. But in business, how you present yourself, self-confidence, an interesting story and non-verbal communication are of at least the same importance.
The innovators programme included one-day visits to technology companies in Silicon Valley, and the opportunity to undertake internships at some of them. I visited Google, the software companies Splunk and Autodesk, as well as NASA and biotechnology firm Genentech. These visits helped me to understand that ambitious work and challenging problems are not just the domain of universities.
I did a three-week internship at PAX Water Technologies in Richmond, California, where I was one of five Polish scientists who set up an interdisciplinary team to work on reducing household water consumption. This was a long way from our research topics, and a new area for all of us. Willingness to learn new things, curiosity, creativity and openness to unexplored areas helped us to drill down into the problem and to propose a solution. All of these are standard skills for a scientist.
The programme helped me to understand that scientists can be effective and successful outside academia, and that the business world is full of challenging problems to work on.
But the most important conclusion for me is that the applied aspect of what I do matters the most. The best fit for me seemed to be a transition into business.
Between June and September 2013, after completing the innovators programme, I applied for several research and development positions in business. I prepared a long CV that covered my research achievements. No one got back to me.
It was an important lesson. As scientists, we have to understand how our skills fit current job-market demands. So I connected with some old university friends who were working in business to discuss their interview experience.
I decided to revamp my CV by making the description of my education shorter and focusing on my transferable skills; I included organizational skills, experience of data-analysis techniques, language skills and my structured approach to problem solving.
As scientists we focus more on problems and solutions when we describe our work. But a potential business employer is more interested in how you get there. You should focus on the tools and methods you have used, knowledge of foreign languages, and how you organize and report your work.
In 2013, I found a job as an analyst at BAE Systems Applied Intelligence at its new offices in Poznań, working with IT systems and insurance data to detect customer fraud.
A year later, I discussed my transition with Fiske, who told me: “Now that you are on the other side, don’t lose touch with your friends in academia — seek ways to help them be more relevant to the outside world.”
I wanted to give something back and to find my own way to contribute to the academic world. I am now head of product development at Analyx, an international marketing data-analytics company, and also work part-time at Poznań School of Banking as a business practitioner, teaching project management as well as systems analysis and design. I discuss the real business cases I face with my students. I also organize lectures and meetings for students with business experts, chief executives and consultants. Some of these have started long-term academic collaborations, and they provide a great opportunity for students to learn from practitioners and to land internships.
I have also managed to organize a master's programme run jointly by academia and business. Students have the chance to get involved in hot industry topics supervised by business experts, and to present results and defend their theses at their universities. Teaching based on my personal experience is more satisfying for me.
Leaving academia was not a failure. It helped me to explore new opportunities, to better understand my professional expectations and to find the career path that fits me best.
This is an article from the Nature Careers Community, a place for Nature readers to share their professional experiences and advice. Guest posts are encouraged. You can get in touch with the editor at email@example.com.
Look west for resistance
With the most to lose from looming federal funding cuts, California's researchers take a stand. In December 2016, at a meeting of the American Geophysical Union, the governor of California, Jerry Brown, declared that if the new Trump administration stopped monitoring the Earth's climate with federal satellites, the Golden State would “launch its own damn satellites.” Brown's response followed earlier comments from a senior advisor to then-president-elect Donald Trump proposing the elimination of funding for NASA's Earth science division. It was the first of many rumours, culminating in deep reductions to federal science spending in the president's proposed budget for 2018. The announcements have coincided with moves to restrict immigration, including a sweeping review of the visa programme used by research institutions to employ foreign scientists. “We've got scientists, we've got lawyers, and we're ready to fight,” said Brown to resounding applause from the crowd of climate scientists. But scientists in California are doing much more than cheering and clapping. Like Brown, they are using their political voice to challenge what they feel has been a gradual erosion of evidence-based policy-making. “From climate change to food scarcity to income inequality, we need people in office who can think creatively and use evidence to make decisions,” says Jess Phoenix, a geologist who studies active and extinct volcanoes across four continents. In April, she announced her decision to run for Congress to represent her home district north of Los Angeles. “We need scientists to take a stand,” she says.
Cutting it close
California, the most populous state in the US, has long been a science stronghold. With a weighted fractional count (WFC) of around 3,000, the research output of institutions in California in the Nature Index is nearly double that of its closest competitor, Massachusetts.
For every 1,000 scientists and engineers working in the state in 2014, the United States Patent and Trademark Office granted 45 patents — the highest rate in the country. Part of the state's research dominance can be explained by the large number of life, physical and social scientists employed in California — almost three times as many as in Massachusetts. In 2016, California received 15% of the total US allocation for the National Institutes of Health (NIH) and National Science Foundation (NSF), the largest share for any state, amounting to US$4.6 billion. But from its position at the pinnacle of research, California stands to lose more than any other state from the cuts to science funding proposed by the Trump administration. Trump's budget outline, released in May 2017, calls for cutting NIH spending by 18% and NSF spending by 11%. California's losses would be likely to have far-reaching implications for the research output of the wider scientific community, given that many scientists in the state collaborate with peers across the country, and the world. In 2016, institutions in California formed more than 8,400 partnerships with counterparts in other states to co-author papers included in the index — the highest number in the country. California's institutions also formed the most partnerships with institutions outside the US. Of course, a budget blueprint is just a president's wish list: an actual budget has to pass through Congress, which has largely rejected slashing funding for scientific research. The budget reconciliation for 2017 added money to federal science agencies. Still, there is much trepidation among scientists about what cuts will pass Congress. “We are in a period of significant uncertainty,” says Randolph Hall, vice president of research at the University of Southern California (USC) in Los Angeles.
If federal funding is cut, California researchers will be looking for more money from the state's budget, foundations and industry. Corporate funding currently makes up about 5% of research money, and private foundations 5–10%. State funding ranges from 2% or less at private institutions like USC, up to 11% at the public University of California system. “While we might hope for these funds to rise in the future, it won't ever come close to the amount from federal funding agencies,” says Hall, who is also the incoming president of the University Industry Demonstration Partnership, an organization that enables interactions between industry and academia.
People politics
Research also requires a reliable supply of talented people. Universities are concerned that reviews of visa policies, such as the 90-day ban on travellers from six Muslim-majority nations, and the more recent restrictions on visitors from a revised list of seven countries, may affect their ability to attract and retain the world's best researchers. When Trump's travel ban first went into effect in January 2017, Giovanni Peri, an economist at the University of California, Davis, was considering a candidate from Iran, one of the countries on the banned list, for a professor position. The selection panel decided that a different candidate was better qualified, but the administration's announcement raised many concerns about whether the suspension on travel barred them from hiring an Iranian. Reforms to the H-1B visa for highly skilled foreign workers could also hinder university recruitment. Universities in California employed more than 3,000 H-1B visa workers in 2015, according to the Office of Foreign Labor Certification. H-1B visas are becoming even more important for universities because fewer US citizens and permanent residents are pursuing advanced degrees in science. In 2014, 25% of the students enrolled in graduate programmes in the US were temporary residents, compared with 21% in 2000.
“Universities would be impoverished and the ability to hire scientists would be reduced if the programme changed,” says Peri. In an analysis of US metro areas between 1990 and 2010, he found that a 1% increase in the number of foreigners filling scientific and technical positions increased the average income of college-educated native workers in that area by 5–6%. The H-1B visa programme does not appear to be at immediate risk. But processing times have lengthened since the Department of Homeland Security suspended fast-track processing of H-1B applications in April 2017.
State-level collaboration
In 2016, institutions in California formed close to 9,500 bilateral partnerships with institutions across the country to co-author papers included in the index. The top 20 states that California institutions formed research links with are ranked by the number of bilateral partnerships.
Global research hub
California is the most collaborative state in the United States, forming the most domestic and international bilateral institutional partnerships. The top 10 most collaborative states in the country are ranked by their total number of bilateral partnerships.
Run, scientists, run
The current political climate has inspired some Californian researchers to look beyond the lab. Following the 2016 election, Phoenix found herself drawn into politics. She was dismayed to learn that her congressional representative, a member of the House Science Committee, does not believe that the federal government should regulate greenhouse gas emissions. In April 2017, she decided to challenge for the seat in the upcoming 2018 midterm elections. Days later, she spoke at a March for Science rally in Los Angeles defending scientific research and informed decision-making. “I'm 35. No-one else is going to get involved politically for me,” says Phoenix, who runs an educational non-profit called Blueprint Earth and is a fellow at the New York-based professional science society, The Explorers Club.
Kevork Abazajian, a physicist studying the origins of the Universe at the University of California, Irvine, is also considering a run for city council — a local office. He hopes to get the town of Irvine to take more action on climate change, for one thing. “After the November election, scientists have been shocked by the degree of incompetence at almost every level of elected office,” he says. “There is a history of scientists going into elected office in other countries, and that's what we need more of.” Abazajian is also the California coordinator for 314 Action, a non-profit group that supports science-savvy candidates and policies. Since January, the group (whose name comes from the value of the mathematical constant π) has organized two training sessions, in Washington DC and California, for scientists interested in running for office. Training included fundraising and crafting a message that sticks with voters. “You have to be a good messenger,” says Abazajian. 314 Action has also supported stem-cell researcher Hans Keirstead in California, along with volcanologist Phoenix, in their bids for Congress in 2018. Adding more scientists would shake up the decision-making process: currently only one of the 535 representatives and senators is a practising scientist with a doctoral degree — physicist Bill Foster of Illinois. “When California leads, the world follows,” says Phoenix. “Now, more than ever, we are called to bring truly representative democracy to the fore.”
Is Science In Trouble?
A Conversation With Colin Camerer About the Replication Crisis
News Writer: Emily Velasco
If there's a central tenet that unites all of the sciences, it's probably that scientists should approach discovery without bias and with a healthy dose of skepticism. The idea is that the best way to reach the truth is to allow the facts to lead where they will, even if it's not where you intended to go. But that can be easier said than done. Humans have unconscious biases that are hard to shake, and most people don't like to be wrong. In the past several years, scientists have discovered troubling evidence that those biases may be affecting the integrity of the research process in many fields. The evidence also suggests that even when scientists operate with the best intentions, serious errors are more common than expected because even subtle differences in the way an experimental procedure is conducted can throw off the findings. When biases and errors leak into research, other scientists attempting the same experiment may find that they can't replicate the findings of the original researcher. This has given the broader issue its name: the replication crisis.
Colin Camerer, Caltech's Robert Kirby Professor of Behavioral Economics and the T&C Chen Center for Social and Decision Neuroscience Leadership Chair, executive officer for the Social Sciences and director of the T&C Chen Center for Social and Decision Neuroscience, has been at the forefront of research into the replication crisis. He has penned a number of studies on the topic and is an ardent advocate for reform. We talked with Camerer about how bad the problem is and what can be done to correct it, and about the "open science" movement, which encourages the sharing of data, information, and materials among researchers.
What exactly is the replication crisis?
What instigated all of this is the discovery that many findings—originally in medicine but later in areas of psychology, in economics, and probably in every field—just don't replicate or reproduce as well as we would hope. By reproduce, I mean taking data someone collected for a study and doing the same analysis just to see if you get the same results. People can get substantial differences, for example, if they use newer statistics than were available to the original researchers. The earliest studies into reproducibility also found that sometimes it's hard to even get people to share their data in a timely and clear way. There was a norm that data sharing is sort of a bonus, but isn't absolutely a necessary part of the job of being a scientist.
How big of a problem is this?
I would say it's big enough to be very concerning. I'll give an example from social psychology, which has been one of the most problematic areas. In social psychology, there's an idea called priming, which means if I make you think about one thing subconsciously, those thoughts may activate related associations and change your behavior in some surprising way. Many studies on priming were done by John Bargh, who is a well-known psychologist at Yale. Bargh and his colleagues got young people to think about being old and then had them sit at a table and do a test. But the test was just a filler, because the researchers weren't interested in the results of the test. They were interested in how thinking about being old affected the behavior of the young people. When the young people were done with the filler test, the research team timed how long it took them to get up from the table and walk to an elevator. They found that the people who were primed to think about being old walked slower than the control group that had not received that priming. They were trying to get a dramatic result showing that mental associations about old people affect physical behavior.
The problem was that when others tried to replicate the study, the original findings didn't replicate very well. In one replication, something even worse happened. Some of the assistants in that experiment were told the priming would make the young subjects walk slower, and others were told the priming would make them walk more quickly—this is what we call a reactance or boomerang effect. And what the assistants were told to expect influenced their measurements of how fast the subjects walked, even though they were timing with stopwatches. The assistants' stopwatch measures were biased compared to an automated timer. I mention this example because it's the kind of study we think of as too cute to be true. When the failure to replicate came out, there was a big uproar about how much skill an experimenter needs to do a proper replication.
You recently explored this issue in a pair of papers. What did you find?
In our first paper, we looked at experimental economics, which is something that was pioneered here at Caltech. We took 18 papers from multiple institutions that were published in two of the leading economics journals. These are the papers you would hope would replicate the best. What we found was that 14 out of 18 replicated fairly well, but four of them didn't. It's important to note that in two of those four cases, we made slight deviations in how the experiment was done. That's a reminder that small changes can make a big difference in replication. For example, if you're studying political psychology and partisanship and you replicate a paper from 2010, the results today might be very different because the political climate has changed. It's not that the authors of the original paper made a mistake, it's that the phenomenon in their study changed. In our second paper, we looked at social science papers published between 2010 and 2015 in Science and Nature, which are the flagship general science journals.
We were interested in them because these were highly cited papers and were seen as very influential. We picked out the ones that wouldn't be overly laborious to replicate, and we ended up with 21 papers. What we found was that only about 60 percent replicated, and the ones that didn't replicate tended to focus on things like priming, which I mentioned before. Priming has turned out to be the least replicable phenomenon. It's a shame because the underlying concept—that thinking about one thing elevates associations to related things—is undoubtedly true.
How does something like that happen?
One cause of findings not replicating is what we call "p-hacking." The p-value measures how likely it is that you would see an effect at least as large as the one you observed if chance alone were at work. If the p-value is low, an effect is highly unlikely to be a fluke due to chance. In social science and medicine, for example, you are usually testing whether changing the conditions of the experiment changes behavior. You really want to get a low p-value because it means that the condition you changed did have an effect. P-hacking is when you keep trying different analyses with your data until you get the p-value to be low. A good example of p-hacking is deleting data points that don't fit your hypothesis—outliers—from your data set. There are statistical methods to deal with outliers, but sometimes people expect to see a correlation and don't find much of one, for example. So then they think of a plausible reason to discard a few outlier points, because by doing that they can get the correlation to be bigger. That practice can be abused, but at the same time, there sometimes are outliers that should be discarded. For example, if subjects blink too much when you are trying to measure visual perception, it is reasonable to edit out the blinks or not use some subjects. Another explanation is that sometimes scientists are simply helped along by luck.
When someone else tries to replicate that original experiment but doesn't get the same good luck, they won't get the same results.
In the sciences, you're supposed to be impartial and say, "Here's my hypothesis, and I'm going to prove it right or wrong." So, why do people tweak the results to get an answer they want?
At the top of the pyramid is outright fraud and, happily, that's pretty rare. Typically, if you do a postmortem or a confessional in the case of fraud, you find a scientist who feels tremendous pressure. Sometimes it's personal—"I just wanted to be respected"—and sometimes it's grant money or being too ashamed to come clean. In the fraudulent cases, scientists get away with a small amount of deception, and they get very dug in because they're really betting their careers on it. The finding they faked might be what gets them invited to conferences and gets them lots of funding. Then it's too embarrassing to stop and confess what they've been doing all along.
There are also faulty scientific practices less egregious than outright fraud, right?
Sure. It is the scientist who thinks, "I know I'm right, and even though these data didn't prove it, I'm sure I could run a lot more experiments and prove it. So I'm just going to help the process along by creating the best version of the data." It's like cosmetic surgery for data. And again, there are incentives driving this. Often in Big Science and Big Medicine, you're supporting a lot of people on your grant. If something really goes wrong with your big theory or your pathbreaking method, those people get laid off and their careers are harmed. Another force that contributes to weak replicability is that, in science, we rely to a very large extent on norms of honor and the idea that people care about the process and want to get to the truth. There's a tremendous amount of trust involved.
If I get a paper to review from a leading journal, I'm not necessarily thinking like a police detective about whether it's fabricated. A lot of the frauds were only uncovered because there was a pattern across many different papers. One paper was too good to be true, and the next one was too good to be true, and so on. Nobody's good enough to get 10 too-good-to-be-trues in a row. So, often, it's kind of a fluke. Somebody slips or a person notices and then asks for the data and digs a little further.
What best practices should scientists follow to avoid falling into these traps?
There are many things we can do—I call it the reproducibility upgrade. One is preregistration, which means before you collect your data, you publicly explain and post online exactly what data you're going to collect, why you picked your sample size, and exactly what analysis you are going to run. Then if you do very different analysis and get a good result, people can question why you departed from what you preregistered and whether the unplanned analyses were p-hacked. The more general rubric is called open science, in which you act like basically everything you do should be available to other people except for certain things like patient privacy. That includes original data, code, instructions, and experimental materials like video recordings—everything. Meta-analysis is another method I think we're going to see more and more of. That's where you combine the results of studies that are all trying to measure the same general effect. You can use that information to find evidence of things like publication bias, which is a sort of groupthink. For example, there's strong experimental evidence that giving people smaller plates causes them to eat less. So maybe you're studying small and large plates, and you don't find any effect on portion size. You might think to yourself, "I probably made a mistake. I'm not going to try to publish that." Or you might say, "Wow! That's really interesting.
I didn't get a small-plate effect. I'm going to send it to a journal." And the editors or referees say, "You probably made a mistake. We're not going to publish it." Those are publication biases. They can be caused by scientists withholding results or by journals not publishing them because they get an unconventional result. If a group of scientists comes to believe something is true and the contrary evidence gets ignored or swept under the rug, that means a lot of people are trying to come to some collective conclusion about something that's not true. The big damage is that it's a colossal waste of time, and it can harm public perceptions of how solid science is in general.
Are people receptive to the changes you suggest?
I would say 90 percent of people have been very supportive. One piece of very good news is the Open Science Framework has been supported by the Laura and John Arnold Foundation, which is a big private foundation, and by other donors. The private foundations are in a unique position to spend a lot of money on things like this. Our first grant to do replications in experimental economics came when I met the program officer from the Alfred P. Sloan Foundation. I told him we were piloting a big project replicating economics experiments. He got excited, and it was figuratively like he took a bag of cash out of his briefcase right there. My collaborators in Sweden and Austria later got a particularly big grant for $1.5 million to work on replication. Now that there's some momentum, funding agencies have been reasonably generous, which is great. Another thing that's been interesting is that while journals are not keen on publishing a replication of one paper, they really like what we've done, which is a batch of replications. A few months into working on the first replication paper in experimental economics funded by Sloan, I got an email from an editor at Science who said, "I heard you're working on this replication thing.
Have you thought about where to publish it?" That's a wink-wink, coy way of saying "Please send it to us" without any promise being made. They did eventually publish it.
What challenges do you see going forward?
I think the main challenge is determining where the responsibility lies. Until about 2000, the conventional wisdom was, "Nobody will pay for your replication and nobody will publish your replication. And if it doesn't come out right, you'll just make an enemy. Don't bother to replicate." Students were often told not to do replication because it would be bad for their careers. I think that's false, but it is true that nobody is going to win a big prize for replicating somebody else's work. The best career path in science comes from showing that you can do something original, important, and creative. Replication is exactly the opposite. It is important for somebody to do it, but it's not creative. It's something that most scientists want someone else to do. What is needed are institutions to generate steady, ongoing replications, rather than relying on scientists who are trying to be creative and make breakthroughs to do it. It could be a few centers that are just dedicated to replicating. They could pick every fifth paper published in a given journal, replicate it, and post their results online. It would be like auditing, or a kind of Consumer Reports for science. I think some institutions like that will emerge. Or perhaps granting agencies, like the National Institutes of Health or the National Science Foundation, should be responsible for building in safeguards. They could have an audit process that sets aside grant money to do a replication and check your work. For me this is like a hobby. Now I hope that some other group of careful people who are very passionate and smart will take up the baton and start to do replications very routinely.
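The outlier-dropping form of p-hacking that Camerer describes is easy to demonstrate with a short simulation. The sketch below is purely illustrative (it is not taken from any of the studies discussed): it draws samples in which there is no real effect at all, then repeatedly discards whichever remaining observation most undercuts the apparent effect, and counts how often the result looks "significant". For simplicity it uses |t| > 2 as a rough stand-in for p < 0.05 rather than computing exact p-values.

```python
import random
import statistics

def t_stat(data):
    """One-sample t statistic against a true mean of zero."""
    n = len(data)
    return statistics.mean(data) / (statistics.stdev(data) / n ** 0.5)

def significant(data, cutoff=2.0):
    # |t| > ~2 roughly corresponds to p < 0.05 for moderate sample sizes
    return abs(t_stat(data)) > cutoff

def p_hacked(sample, max_drops=5):
    """Drop up to max_drops 'outliers', chosen after looking at the data,
    until the test crosses the significance cutoff."""
    data = list(sample)
    while True:
        if significant(data):
            return True
        if len(sample) - len(data) >= max_drops:
            return False
        # discard whichever point most opposes the current apparent effect
        data.remove(min(data) if statistics.mean(data) > 0 else max(data))

random.seed(0)
trials = 2000
honest = hacked = 0
for _ in range(trials):
    sample = [random.gauss(0, 1) for _ in range(30)]  # no real effect exists
    honest += significant(sample)   # analyse the data as collected
    hacked += p_hacked(sample)      # analyse after data-dependent trimming

print(f"honest false-positive rate: {honest / trials:.3f}")
print(f"hacked false-positive rate: {hacked / trials:.3f}")
```

Running this shows the honest analysis flagging roughly the expected one-in-twenty of the null samples, while the trimmed analysis flags substantially more, even though every "significant" result is a pure fluke. This is exactly why preregistered analysis plans, with outlier rules decided before seeing the data, close off the loophole.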
Start-ups: A sense of enterprise
Universities aid entrepreneurs by helping them to turn their research into companies. In return, universities can reap financial benefits. Michael Schrader knew he wanted to create a company, but he wasn't sure what it should do. After six years as a mechanical engineer in the automotive industry building plastic parts, in 2010 he began a master's degree in business administration at Harvard Business School in Boston, Massachusetts. In his quest for inspiration, he took a course in commercializing science at the Harvard Innovation Lab (i-lab). The class heard presentations from researchers who among them had developed 17 different technologies that they thought had commercial value. One in particular caught Schrader's attention — a method devised by two engineers from Tufts University that uses a silk protein to stabilize vaccines. The vaccines could be formulated as powders and mixed with water when it was time to inject them, or embedded into a film that dissolves on the tongue like a breath-freshening strip. And, because they would not need to be refrigerated, they would be easier than conventional vaccines to distribute in places such as sub-Saharan Africa. Along with other members of his class — an economics master's student, a former physics student earning a law degree and a postdoc in the chemistry department — Schrader spent the next few months looking into potential markets for the technology, making connections with business mentors and investors, and putting together a business plan. In 2012, the team founded Vaxess Technologies, which is attempting to bring vaccine formulations to market. “We probably are a perfect model for how universities can forge together entrepreneurs and technologies to create companies,” says Schrader, now chief executive of Vaxess. 
The technology has not yet entered clinical testing, but the company has raised more than US$5 million, hired 11 employees, and started filing patents of its own in addition to those it licensed from Tufts University. Although universities often license technology developed in their research laboratories to existing companies that are looking for new products, they also move discoveries off the bench and into the real world by encouraging inventors to start businesses from scratch. They offer classes in entrepreneurship, introduce researchers to investors and business experts, and even launch their own venture-capital funds. The path is trickier for life-sciences spin-offs, which take more time and money to get off the ground, than for companies based on software or electronics. And Europe has not caught up with the United States in its ability to create businesses. But universities are banking on entrepreneurs turning some of their research into products (see 'Start-up sampler').

Table 1: Start-up sampler. Universities seeking to commercialize research spin off scores of companies. These examples show the range of entrepreneurship spawned in the life sciences.

Hubs of innovation

Universities tend to see commercialization as part of their remit to create and disseminate knowledge. “We exist on taxpayer money. We have an obligation to try to get our research out into society,” says Regis Kelly, director of the California Institute for Quantitative Biosciences, known as QB3. The institute is a collaboration between the Berkeley, Santa Cruz and San Francisco campuses of the University of California. It supports life-sciences research across the campuses and tries to bring that research to market by partnering with industry and promoting entrepreneurship.
Part of the mission of the University of Colorado Boulder's BioFrontiers Institute is to aid students and faculty members who want to start new companies, says Jana Watson-Capps, associate director of the institute. “It fits with what we want to do in providing an education for our students so that they can find jobs and be good at those jobs,” she says. A similar attitude is common in the United Kingdom. “We think it's important here in Oxford to see that the fruits of our research are actually developed to benefit society,” says Linda Naylor, managing director of Isis Innovation, a company created by the University of Oxford to commercialize its research. Harvard's i-lab, which opened in late 2011 to help students in any of the university's schools to develop businesses, is a relatively new entry in a long line of such efforts at academic institutions. Students learn about idea generation, business-plan development and marketing. Budding entrepreneurs can attend workshops on specific hurdles that they are likely to encounter, such as how to apply for a Small Business Innovation Research grant from the federal government. A group of 'experts in residence' provides students with business expertise and introduces them to potential investors. The i-lab holds competitions such as the President's Challenge, which rewards ideas that address the world's big problems. Vaxess took the challenge's top prize of $70,000 in 2012, as well as winning $25,000 in Harvard's Business Plan Contest the same year. Because the main thrust of the i-lab is education, the university never takes a stake in any of the companies created there, says managing director Jodi Goldstein. Any intellectual property developed in a Harvard research lab belongs to the university and must be licensed, but ideas generated in the i-lab belong to the students.
Goldstein hopes that the i-lab can help a future Mark Zuckerberg or Bill Gates to pursue their billion-dollar idea while still completing their degree. “We have several pretty famous dropouts around here, and I don't think that's necessary anymore,” she says. As well as education and expertise, the i-lab provides a workspace for fledgling companies. Meeting rooms, computer workstations and private storage space are available, as are a workshop for building prototypes and a pair of 3D printers. The i-lab is also planning to address one of the stumbling blocks that often trips up biology-based companies: finding a space in which to turn a discovery made in a university lab into a more marketable version. It is building a 1,400-square-metre wet lab with 36 research benches. When Vaxess reached that stage, it moved to LabCentral in Cambridge, Massachusetts. The provider of office and laboratory space takes care of regulatory requirements and provides administrative support and laboratory personnel so that new companies don't have to spend time and money setting up their own space. It opened in 2013 with a $5-million grant from the Massachusetts government (part of an initiative to bolster life-sciences business in the state), along with support from the Massachusetts Institute of Technology and the venture-capital arm of health-care giant Johnson & Johnson. Schrader considers this industry–government–academia web of support essential to his company's launch. “We have really taken advantage of this growing entrepreneurial ecosystem,” he says. At QB3 in California, start-ups can rent lab space for as little as $85–100 per square metre per month. And unlike conventional landlords, who prefer to rent out an entire space, QB3 lets start-ups rent a few hours in a fume cupboard or a shelf in a freezer, for example. “You only pay for what you actually use,” Kelly says. Charging is important, mainly because it weans users off the university teat.
“It gets people more used to being in the private sector,” he says. The need for lab space is just one reason why starting a life-sciences company can be much more challenging than, say, launching a business based on software. Any sort of pharmaceutical or medical device is subject to regulatory requirements, which lead to safety tests and clinical trials. “If you're going to make a new drug you might need ten years and a billion dollars,” says Watson-Capps. These time and capital requirements make it much more difficult to drum up investment for a life-sciences start-up. Although investors might be willing to risk a couple of hundred thousand dollars on a promising software idea, most life-sciences companies need initial funding of a few million dollars. “Obviously, people don't want to throw away a million dollars, so they have to do a lot more due diligence,” Kelly says. And because the time to realize a return on the investment can be so long, trading equity in the company in exchange for, say, legal services is not as popular as it is for other types of start-up, he adds. These disparities are apparent in the investment statistics. Of the $77.3 billion in venture capital invested in the United States in 2015, software companies took in $31.2 billion, or 40% of the total. Pharmaceuticals and biotechnology received a mere 12%.

Playing catch up

Europe lags behind the United States in producing start-ups of any kind, but the situation is improving. “We're certainly seeing a lot more spin-outs than we were a few years ago,” says Naylor. “There is more money around that is willing to go into the early stage.” She attributes that growth, in part, to the UK government's creation of the Seed Enterprise Investment Scheme in 2012, which provides tax breaks to investors in start-up companies.

[Image: Vaxess Technologies is using silk proteins (left), extracted from cocoons (right), to stabilize vaccines. Credit: Patrick Ho/Vaxess]
“The UK has been one of the leaders in providing tax incentives for investors in start-ups of all types,” says Karen Wilson, who studies entrepreneurship and innovation at Bruegel, an economic think tank in Brussels. Other countries across Europe, as well as Australia, have created their own tax incentives for investors modelled on the British scheme, although Wilson says that they're often controversial, derided as tax breaks for the wealthy. In the United States, tax incentives vary by state. The biggest legal change in the United States to promote spin-offs came in 1980, Wilson says, with the passage of the Bayh–Dole Act, which allowed researchers to profit from inventions created with federal funding. US and UK universities have even been creating their own venture funds in recent years to invest in their spin-offs. The University of Cambridge, UK, created Cambridge Innovation Capital in 2013 with an initial fund of £50 million ($71 million). In 2014, the University of California launched a $250-million fund. In May 2015, Isis launched Oxford Sciences Innovation to raise an initial £300 million from investors. And, in January, University College London opened the £50-million UCL Technology Fund, and the University of Bristol, UK, started its own enterprise fund (see 'Innovation income').

Box 1: Licensing technology: Innovation income

When it comes to commercializing research, universities often emphasize their desire to spread their discoveries, but they also reap financial rewards from licensing technology and investing in spin-off companies. Isis Innovation, for instance, took in £24.6 million (US$34.9 million) in revenue in 2015, of which it returned £13.6 million to its founder, the University of Oxford, more than double the £6.7 million of 2014.
The university also earned more than £30 million in cash and stocks from the 2014 sale of the games and technology company NaturalMotion (in which it had a stake of about 9%) to Zynga in San Francisco, California, for $527 million. NaturalMotion was co-founded in 2001 by Torsten Reil, then a PhD student in Oxford's zoology department studying neural systems. Reil used his research to create computer simulations that more accurately mimic how animals move, and turned them into a company that makes popular games such as Clumsy Ninja. But licensing income tends to make up only a small part of a university's revenue stream. Harvard University in Cambridge, Massachusetts, which last year issued 50 licenses to patents it owns and saw 14 firms started on the basis of its technology, had licensing revenue of $16.1 million in 2015. But that is a fraction of Harvard's 2015 budget of nearly $4.5 billion, of which the university spent $876 million on research. Jana Watson-Capps, associate director of the University of Colorado Boulder's BioFrontiers Institute, says that income from all licensing, not just from spin-off companies, is valuable to the university and goes back into funding research. However, she adds, licensing income is relatively small and comes so long after the initial investment that it is not a major consideration at the institute. A similar attitude prevails at Oxford. Although the university welcomes the licensing income, it's not the only motive for promoting spin-offs, says Linda Naylor, managing director of Isis Innovation. “The university is very clear it wants to create impact,” she says. “They're not there to make any quick money.” Entrepreneurial ecosystems in which inventors can find facilities, investors and business experts to help them to launch their companies are important for creating successful spin-offs, and they have been growing around many European universities, Wilson says.
“There are an increasing number of these entrepreneurial hubs that are emerging across Europe, which are spawning these innovative high-growth firms,” she says. In the United Kingdom, Cambridge is popular for life-sciences start-ups, and in Munich, Germany, the focus is mobile technology. In Switzerland, start-ups are clustered around the University of Zurich and the Swiss Federal Institute of Technology in Lausanne, where they focus on computing and technology. In Finland, Espoo is a hub: in 2010, three institutions combined to form Aalto University, which has strengths in communications, energy and design. Linked by a bridge across the Øresund strait, Copenhagen in Denmark and Malmö in Sweden make up another life-sciences centre. In the past year, however, the influx of refugees from the Middle East has led to a tightening of border security and made crossing the bridge more difficult for everyone. The clampdown on migration within Europe, says Wilson, is making it harder for fledgling companies to grow and spread. Expanding their markets has always been challenging for start-ups in Europe, she says, where pushing into another country means dealing with differences not only in language and culture but also in taxes and other regulations. Many European companies reach a point at which, to grow into a bigger market, they move to the United States, either of their own accord or at the insistence of their investors. “If you have a successful start-up in Italy it's much easier to go scale it in the US than it is to try to scale it across Europe,” Wilson says. But many life-sciences companies won't grow on their own, particularly if their innovation is a drug: their endgame is often to be acquired by a large pharmaceutical company once they have advanced their therapy to a promising stage.
Although life-sciences companies demand more resources than other types of start-up, they have one characteristic that can make them uniquely appealing to investors: the potential to cure a disease or improve human health. As Kelly points out, “Almost any rich person has a sick relative.” If investors are going to risk their money, knowing that many of the companies they invest in will fail, they may prefer investments that have a potential for making a difference, he says. “If they're going to lose money on a business, they might as well lose it on something that could have some benefit to society.”