Measuring the efficiency of university technology transfer

BYLINE: Timothy R. Anderson, tima@etm.pdx.edu; Tugrul U. Daim*, tugrul@etm.pdx.edu; Francois F. Lavoie

Universities provide education as well as innovations resulting from their research. This paper focuses on the service of transferring research results into other sectors. Many stakeholders, such as academic researchers, technology transfer offices (TTOs) and private industry, are involved in technology transfer, which calls for a comprehensive approach. A data envelopment analysis (DEA) approach is used as a productivity evaluation tool applied to university technology transfer. The methodology included weight restrictions, providing a more comprehensive metric. The results include an examination of efficiency targets for specific universities as well as the peer counts of inefficient universities. Evidence of significant inefficiency in university technology transfer is found in many leading universities. An examination of differences between public and private universities and between those with and without medical schools indicated that universities with medical schools tend to be less efficient than those without, although the effect was not statistically significant.

1 Introduction

Commercial activity based on technology licensing has become a prominent and very lucrative business for universities in the US and around the world. As an example, US universities that responded to the Association of University Technology Managers (AUTM) survey for fiscal year 2004 reported total revenues of $2.51 billion from license/option income and running royalties on product sales (AUTM 2004). By contrast, the same institutions spent $41.24 billion on sponsored research, 67% of which was funded by the Federal Government (AUTM 2004).

Many stakeholders, such as academic researchers, technology transfer offices (TTOs) and private industry, are involved in the technology transfer process. Among these three players, TTOs are considered by many to be the key stakeholders determining a university's overall success at this business process. The primary role of a TTO is to manage and perform technology transfer activities (AUTM 2004). Many studies have shown that a great number of these TTOs operate inefficiently, and some studies have been conducted to understand the underlying deficiencies.

University technology transfer has gained increasing attention during the 15 years since AUTM began publishing its university surveys in 1991. Figs. 1 and 2 summarize the results of the recent surveys.

Fig. 2 shows that license income and invention disclosures increased at a rate similar to sponsored research between 2001 and 2004. When we assess the ratio of each output to research spending, however, we begin to question the effectiveness of university technology transfer. A simple calculation of research expenditures per invention disclosure and per licensing income dollar may at first glance lead a sceptic to question that effectiveness. Heher (2006) provides a forecast of the income from university innovations; his finding of an expected exponential increase also justifies exploration of the field.
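
For example, taking the FY 2004 aggregates above at face value (a rough illustration that ignores the multi-year lag between research spending and the income it eventually produces):

$$\frac{\text{licensing income}}{\text{sponsored research}} = \frac{\$2.51\text{ billion}}{\$41.24\text{ billion}} \approx 6.1\%.$$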

This issue of efficiency has been explored using different methods, as described later in the literature review. The paper has the following objectives:

  • 1. To present exploratory work to assess and confirm the relevance of DEA as a productivity evaluation tool.

  • 2. To examine the efficient and inefficient TTOs within US universities, as well as alternative DEA methods for assigning weights to certain outputs according to their relative importance.

  • 3. To examine whether there is a correlation between university efficiency and the existence of a medical school using linear regression, and, in the same manner, to investigate whether private universities are more efficient than their public counterparts in terms of technology transfer.

2 Literature search

University research and its transfer to industry has been a topic of interest in the management of technology literature for decades. We see the literature grouped under the following titles:

  • Organizational structures.
  • Regional or international comparisons/case studies.
  • Impacts of university research.
  • Tangible outputs of university research (patents, licenses, spin-offs).
  • Efficiency of university research transfer.

Several researchers have focused on organizational issues. Siegel et al. (2003) explored the organizational structures of TTOs, linking them to their productivity and suggesting that the most critical organizational factors for TTO productivity in research universities are faculty reward systems, TTO staffing/compensation practices, and cultural barriers between universities and firms. Rasmussen et al. (2006) explored initiatives provided by universities to promote commercialization of university knowledge and identified coordination as a challenge. McAdam et al. (2005) provide such a coordination model for university innovation centers, analyzing licensing and business-building processes. Chapple et al. (2005) indicated that there is a need to increase business skills and management capabilities within TTOs. Thursby and Kemp (2002) also explored the efficiency of university technology transfer by looking at organizational issues. Siegel and colleagues (Siegel et al., 2003, 2004) studied similar issues, focusing on the impact of organizational characteristics and the implications for education. They make recommendations based on the barriers identified in university technology transfer (UTT) efficiency and effectiveness, such as culture clashes, bureaucratic inflexibility, poorly designed reward systems, and ineffective management of TTOs. Lowe (2006) proposes a theoretical model of how inventor know-how affects whether an inventor starts a firm to develop her idea or licenses the invention to an established firm for development; the model is then used to analyze the role and impact of a university TTO on this process and to understand how TTOs may both positively and negatively affect the transaction. Leitch and Harrison (2005) explored the dynamics of the spin-off phenomenon with a focus on the TTO and propose a wider role for such offices to be more efficient. Lopez (1998) explored different ways universities can organize to improve research efficiency. This group of literature supports our hypothesis that there are efficiency issues in transferring technology out of the university environment.

We also see studies comparing different approaches or regions. Goldfarb and Henrekson (2003) and Feldman et al. (2002) studied different policies for transferring university technology. Di Gregorio and Shane (2003) explored differences among universities in the commercialization of technologies. Colyvas et al. (2002) presented case studies of the commercialization of university inventions. Lee and Win (2004) explored three university research centers in Singapore, concluding that coordination among the university center, industry and government is one of the key success factors. Owen-Smith et al. (2002) compared US and European practices in university-industry relations. Other studies focused on individual cases to explore similar issues. Zucker et al. (2002) looked at the efficiency of university technology transfer through a biotechnology case study. Lopez-Martinez et al. (1994) found that in developing countries, specifically Mexico, academia and industry have implicit cultural dissimilarities which directly affect current or potential cooperative liaisons. The industry-academia interdependencies in Germany have been well studied (Meyer-Krahmer and Schmoch, 1998; Beise and Stahl, 1999); the findings indicate that there are certain requirements to be met by both parties for successful long-term collaborations. Boyle (1986) focused on technology transfer between universities and the UK offshore industry; Corsten (1987) reviewed industry-university collaborations in 225 enterprises; and Goldhor and Lund (1983) provided a detailed analysis of the transfer of a text-to-speech reading machine from MIT into industry. This group of literature further verifies the efficiency issue by adding another dimension of variance: organizational, cultural and regional differences can all make a difference.

Some other studies focused on the impact of university research. Feller et al. (2002) and Cohen et al. (2002) specifically explored the impact of university research on industrial innovation. Shane and Stuart (2002) studied the start-ups resulting from university research. Siegel et al. (2003) concluded that university science parks do not have a significant impact on research productivity. Bennet et al. (1998) focused on university-industry collaboration for technology transfer in poorer regions of the United Kingdom; such collaborations are reported to be successful and to help local economies.

Studies that explore efficiency through tangible outputs are found frequently in the relevant literature. Trune and Goslin (1998) studied the performance of TTOs from a profit/loss perspective; their results indicate that such centers are profitable and act as significant economic drivers. Berman (1990) also provided evidence on the economic impact of industry-funded university R&D. Several studies (Agrawal and Henderson, 2002; Mowery et al., 2002; Shane, 2002) have specifically explored patenting within universities. Geuna and Nesta (2006) fear that the increase in university patenting exacerbates the differences across universities in terms of financial resources and research outcomes. Also, because of intellectual property rights (IPRs), there is a tendency for universities and academics to limit disclosure of materials and information, thereby fostering growing commercialism and competition among universities and dampening open science and knowledge transfer (Sampat, 2006). Mazzoleni (2006) presents a model of R&D competition based on a university invention where appropriability conditions are defined by the patentability of downstream innovations and imitation opportunities; he concludes that university licensing royalties are therefore a poor gauge of the social welfare gains from university patenting.

Among the studies focusing on the tangible outputs of university technology transfer, several focused on spin-off companies. Perez Perez and Sanchez (2003) explored the development of university spin-offs, explaining the early dynamics of technology transfer and networking. Libaers et al. (2006) examined the role of university spin-out (USO) companies in the emergence of a new technology, in this case nanotechnology; they conclude that USOs are important contributors to technological change in specific subfields of nanotechnology, but that other actors, notably large firms and new technology-based firms, are even more significant agents of technological change. Niosi (2006) found that the growing companies of the 2000s are most often not in biotechnology, in spite of their frequent support by venture capital; conversely, the spin-off companies that grew had often obtained patents and received support from the Industrial Research Assistance Program, a support program for R&D in smaller firms managed by the National Research Council of Canada. Ndonzuau et al. (2002) also explored spin-off creation and proposed a stage model with clear expectations at each stage.

The group of studies focusing on the impacts and tangible outputs of university research verified our approach of using tangible outputs to measure the efficiency of university technology transfer.

A series of studies built models to establish efficiency metrics and frameworks. Thursby and colleagues focused on university licensing efficiency in various forms (Thursby et al., 2001; Thursby and Kemp, 2002; Thursby and Thursby, 2003, 2004). Siegel and Phan (2004) described data envelopment analysis (DEA) and stochastic frontier estimation (SFE) as the two most widely used tools for carrying out this evaluation; in the same paper, these authors also presented a comprehensive list of publications and associated methodologies on the subject of technology transfer assessment. Earlier, Thursby and colleagues had applied DEA to similar problems (Thursby and Kemp, 1998; Thursby and Thursby, 2002). Powers (2003) used regression to analyze the relationship between research expenditure and the resulting tangible output. The efficiency of university TTOs can be measured in numerous ways; the simplest would be to rank universities purely on licensing income. For instance, The Chronicle of Higher Education (Blumenstyk, 2005) published an article on technology transfer presenting a table that ranked universities by licensing revenues. This ranking of the top 54 universities (institutions that generated at least $2 million in licensing revenues) was based on data from the AUTM US Licensing Survey for fiscal year 2004. Other tools can be and have been used to evaluate the university licensing process. This group of literature supports the use of DEA to explore organizational characteristics such as the existence of a medical school or private versus public status.

3 Methodology

3.1 Relative weight of certain outputs

A strength of DEA is that it allows the relative weight of each output to vary for each university. Given the varying industries engaged by different universities, the importance of each output may also vary; it is therefore quite difficult to make generalizations about the relative weight or value of, say, a startup compared to licensing income. On the other hand, one relationship stands out: an issued patent is far more valuable than a mere patent application. We therefore modeled a relative weight restriction to ensure that an issued patent carried at least five times the weight of a patent application. Additional relationships could be incorporated in future work.
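
In the multiplier form of the DEA model, such a restriction is a single linear constraint on the output weights; with $u_{\text{issued}}$ and $u_{\text{applied}}$ denoting the weights on patents issued and patent applications (our notation), it reads:

$$u_{\text{issued}} \ge 5\, u_{\text{applied}}.$$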

3.2 Caveats

The time lag between inputs and outputs was recognized by earlier authors (Thursby and Thursby, 2002) and dealt with by using data averaged over a six-year period. We took advantage of a strength of DEA and the nature of this problem to keep the original data. Conceptually, we could have included preceding years as separate inputs to allow for separate time lags between research expenditures and the technology transfer outcomes. DEA scores are robust with respect to the inclusion of highly correlated inputs. We examined correlations between years and found them extremely high, as shown in Table 1, ranging from a low of 98.0% (2001-2004) to a high of 99.8% (2001-2002). On this basis, the results would be very similar whether we used 2001 research expenditures rather than 2004 as the input, some weighted combination of R&D expenditures from different years, or even multiple years as separate inputs.
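
A minimal sketch of this correlation check follows (our own illustration; the file name and column names are assumptions, not the AUTM format):

```python
import pandas as pd

# Hypothetical layout: one row per university, one column of research
# expenditures per fiscal year.
spend = pd.read_csv("research_expenditures.csv")

# Pairwise Pearson correlations between years (cf. Table 1: 98.0%-99.8%).
print(spend[["rd_2001", "rd_2002", "rd_2003", "rd_2004"]].corr())
```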

Eight universities responded anonymously to the 2004 licensing survey and therefore were not included in the data set and analysis. In particular, Blumenstyk (2005) discussed Columbia University, which generated more than $116 million in licensing revenue and listed its 2004 licensing revenue on its web site, but chose to respond to the AUTM survey anonymously. Columbia University would have had the highest licensing revenue among all universities and would therefore have been deemed efficient. Including the other anonymous responses might affect the results as well.

The following sections link the DEA model to the university licensing process.

3.3 Stakeholders

  • Funding establishments: public and private.
  • Universities:
      ○ Researchers.
      ○ TTOs.
  • Private industries.

3.4 Technology licensing process

Rogers et al. (2000) presented an assessment of TTOs, including a basic view of the university licensing process. This process view included all of the outputs that we used except one (Y5, US Patents issued). An adaptation of this generic university licensing process is illustrated in Fig. 5.

3.5 Universities evaluated and key characteristics

A total of 54 universities were evaluated in our model. The list of universities with key characteristics is presented in Table 2. Key characteristics include whether or not a given university has a medical school and the private versus public nature of the institution. With these data, we will examine the relationship between the private or public nature of the institutions and their efficiency based on our DEA results. For the above characteristics, a regression was performed; the results are presented later.

This study has the following research questions, as identified earlier:

  • 1. Can we use DEA as a productivity evaluation tool to assess university technology transfer efficiency?

  • 2. Which universities are efficient and which ones are inefficient in transferring technology?

  • 3. Is there a relationship (assessed via linear regression) between university efficiency and the existence of a medical school?

  • 4. Are private universities more (or less) efficient than their public counterparts in terms of technology transfer?

To answer the above research questions, we developed the following hypotheses:

  • 1. DEA can effectively be used as a productivity evaluation tool to assess university technology transfer efficiency.

  • 2. There are efficient and inefficient universities in transferring technology when compared against research expenditures and the resulting output.

  • 3. There is a correlation between university efficiency and the existence of a medical school.

  • 4. Private universities are more efficient than their public counterparts in terms of technology transfer.

Thursby and Thursby (2002) presented a three-stage process using DEA to assess the source of growth in university licensing; this model is summarized in Fig. 3. These authors explicitly decided to exclude two widely reported metrics from the measurement of technology transfer efficiency, "patents issued" and "licensing income", because of the time lag involved in converting key inputs into those two outputs in their respective stages. Rather, they found the number of patent applications more suitable than the number of patents issued as an output for stage 2; likewise, "license and option agreements executed" was used instead of licensing income for the final stage.

Secondary goals of this paper include the presentation of success stories within US universities, as well as alternative DEA methods for assigning weights to certain outputs according to their relative importance. Additional sub-objectives are to present target figures, in terms of measured outputs, for less successful universities. Finally, we will attempt to determine whether there is a correlation between university efficiency and the existence of a medical school using linear regression; in the same manner, we will investigate whether private universities are more efficient than their public counterparts in terms of technology transfer.

The DEA model used was the output-oriented, variable returns to scale envelopment model. DEA was first developed by Charnes et al. (1978). The sequence of linear programs is described in the following approach. A second-phase slack maximization was also performed, but the results were virtually identical; for the sake of explanation, we will limit our discussion and results to the simpler single-phase approach, which can cause slight variations in the slacks identified. Comprehensive references on DEA (Cooper et al., 2004) and a DEA book focused on the service sector (Sherman and Zhu, 2006) are widely available.

For each university $k$, solve the following linear program, measuring the distance to the efficiency frontier:

$$\max \theta_k$$

subject to producing more output (university technology transfer outcomes),

$$\sum_{j} \lambda_j y_{r,j} \ge \theta_k \, y_{r,k} \quad \forall r,$$

using no more input (research spending),

$$\sum_{j} \lambda_j x_{i,j} \le x_{i,k} \quad \forall i,$$

allowing for variable returns to scale by requiring that each university be compared against a full university made up of parts of one or more universities,

$$\sum_{j} \lambda_j = 1,$$

and requiring that negative universities cannot be used for evaluation,

$$\lambda_j \ge 0 \quad \forall j.$$

Repeat for the next university $k$.

The value of $x_{i,j}$ describes the amount of the ith input used by the jth university. In our application, we are considering a single input of research spending dollars. The outputs are given by $y_{r,j}$ for the rth output of the jth university. The fundamental decision variables are the $\lambda_j$, where $\lambda_j$ represents how much of the jth university is used in setting a performance target for university k. Fractional values of $\lambda_j$ are allowed, as this enables a target for university k to be developed from a combination of other universities. A more restrictive extension of DEA is the free disposal hull (FDH) model (Tulkens, 1993), where a decision-making unit (in our case, a university) is only compared against an entire other decision-making unit (university). This would be implemented by further constraining each $\lambda_j$ to be binary, but was not done here as it is quite conservative and may impede revealing "hybrid" operational strategies represented by a mix of several universities.

The objective function is simply to find the best possible target of performance for university k and is represented by $\theta_k$. Values of $\theta_k > 1$ indicate that more of each output should be achievable by university k using the same or fewer research dollars.
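
For readers who wish to replicate the basic model, the following is a minimal sketch (our own illustration, not the authors' code) of the envelopment linear program above using SciPy; the weight restriction of Section 3.1 and the second-phase slack maximization are omitted for brevity, and the example data are made up:

```python
import numpy as np
from scipy.optimize import linprog


def dea_output_vrs(X, Y, k):
    """Output-oriented, variable-returns-to-scale DEA envelopment score.

    X: (n, m) array of inputs (here m = 1: research spending).
    Y: (n, s) array of outputs (here s would be 5: licenses/options,
       licensing income, startups, patent applications, patents issued).
    Returns (theta_k, lambdas): theta_k = 1 means efficient; theta_k > 1
    is the factor by which every output should be expandable.
    """
    n, m = X.shape
    s = Y.shape[1]

    # Decision vector: [theta, lambda_1, ..., lambda_n]; linprog minimizes,
    # so maximize theta by minimizing -theta.
    c = np.concatenate(([-1.0], np.zeros(n)))

    # Outputs: theta * y_rk - sum_j lambda_j * y_rj <= 0 (produce more output).
    A_out = np.hstack([Y[k].reshape(s, 1), -Y.T])
    # Inputs: sum_j lambda_j * x_ij <= x_ik (use no more input).
    A_in = np.hstack([np.zeros((m, 1)), X.T])
    A_ub = np.vstack([A_out, A_in])
    b_ub = np.concatenate([np.zeros(s), X[k]])

    # VRS convexity: sum_j lambda_j = 1 (targets are "full" composite universities).
    A_eq = np.concatenate(([0.0], np.ones(n))).reshape(1, -1)

    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                  bounds=[(None, None)] + [(0.0, None)] * n, method="highs")
    return res.x[0], res.x[1:]


# Toy illustration with made-up numbers (not AUTM data):
# 3 universities, 1 input, 2 outputs.
X = np.array([[100.0], [200.0], [150.0]])
Y = np.array([[10.0, 5.0], [30.0, 8.0], [12.0, 4.0]])
for k in range(3):
    theta, lam = dea_output_vrs(X, Y, k)
    print(f"university {k}: theta = {theta:.3f}, peers = {np.round(lam, 2)}")
```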

The DEA input-output model is illustrated in Fig. 4. The model contains one input and five outputs, and the source data for this are from The Chronicle of Higher Education (Blumenstyk, 2005).

4 Results

4.1 Efficiency scores and ranking

Significant results for the DEA model are displayed in Table 3. An example of a significant result would be a university that displayed a remarkably large (positive or negative) difference between its ranking based on revenues generated and its ranking based on the DEA evaluation. Key results presented in this table include the DEA efficiency score $\theta_k$. The histogram in Fig. 6 shows the distribution of the efficiency scores. Inefficient universities are indicated by DEA efficiency scores greater than 1.000, since this is an output-oriented model.

4.2 Target outputs

Table 3 presents target outputs for selected inefficient universities. These targets are formed by linear combinations of the seven efficient universities that use no more input (research spending) to produce more output. The difference between the target outputs in Table 3 and the actual outputs in Table 2 is interesting. For example, Wake Forest University, ranked eighth in terms of licensing income, has a target licensing income of $57.8 million as compared to the $34.3 million it actually earned, leaving a slack of $23.5 million.
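
In envelopment terms, the target for output $r$ of university $k$ is the peer-weighted combination of efficient universities, and the slack is the gap to the actual value (notation as in Section 3):

$$\hat{y}_{r,k} = \sum_{j} \lambda_j^{*}\, y_{r,j}, \qquad \text{slack}_{r,k} = \hat{y}_{r,k} - y_{r,k}.$$

For Wake Forest's licensing income, this gives $57.8 million less $34.3 million, or $23.5 million of slack.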

4.3 Regression analysis

In this section, we assess, through a linear regression, whether the two key characteristics presented in Table 2 influence the DEA efficiency score obtained. These characteristics are the presence of a medical school at a given university and the public or private nature of each institution. The results of this linear regression are presented in Fig. 7.
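
The regression takes the form below, where $\text{Medical}_k$ and $\text{Private}_k$ are 0/1 indicator variables (our notation; the paper does not give the exact specification):

$$\theta_k = \beta_0 + \beta_1\,\text{Medical}_k + \beta_2\,\text{Private}_k + \varepsilon_k.$$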

The regression results showed surprisingly little explanatory power and statistical significance. The presence of a medical school is close to being statistically significant, and its coefficient of −0.15 would indicate a strong managerial impact of decreasing efficiency by 15%. Having said that, it would be important to check for alternative explanations, such as a potentially longer time lag for medical-related research due to regulatory delays. These results are consistent with those of Powers (2003), who used a series of regression models and found that neither the presence of a medical school nor being a private university significantly affected patenting and licensing activities.

5 Discussion

5.1 DEA efficient universities

From Table 3, we see that seven institutions are deemed efficient in terms of DEA: New York University, the University of California System, the University of Wisconsin Madison, MIT, Caltech, Brigham Young University (BYU) and the Georgia Institute of Technology. There are typically multiple reasons why a particular unit is efficient in a DEA study. By its nature, DEA uses efficient universities to form demonstrated targets for inefficient universities; as such, we note the frequency with which each university is used in setting performance targets.

  • 1. New York University (NYU): NYU was deemed efficient because it generated the most Licensing income of all DMUs under consideration. NYU was used in setting targets for 25 of the 47 inefficient universities.

  • 2. University of California System: The University of California System reported results aggregated across multiple campuses. The size of the system made it efficient primarily because it filed the largest volume of US Patent applications and was granted the most patents (US Patents issued). It should be noted that the system's nearest contenders in US Patent applications (MIT and Caltech) filed only slightly more than half as many applications. The extremely high values in these two outputs may be indicative of higher licensing income in the future. The University of California System is by far the largest research spender at $2.7 billion, almost twice as much as its closest contender (Johns Hopkins University). Future work may examine issues of economies (or diseconomies) of scale as well as disaggregated data across the system. The University of California System is quite unique because of its size and, not surprisingly, was used infrequently, setting targets for only 2 of the 47 inefficient universities: Stanford University and Johns Hopkins University.

  • 3. University of Wisconsin Madison: While not the best on any single metric or ratio of a metric to research spending, the University of Wisconsin Madison performed well on most of the metrics and is calculated as efficient for this reason. It was used in part for setting targets of 35 of the 47 inefficient universities.

  • 4. Massachusetts Institute of Technology (MIT): MIT showed strong performance in terms of Startup companies and was deemed efficient for this reason. Its strong DEA efficiency score enables MIT to improve its ranking by 11 positions compared to the ranking based on Licensing income. MIT was used in part for setting targets of 11 of the 47 inefficient universities.

  • 5. California Institute of Technology (Caltech): Caltech received an excellent rating due to three main factors: strong numbers in terms of Startup companies, US Patent applications filed and US Patents issued. This is quite an achievement given its lower-than-average Total research spending. Caltech made an impressive leap forward in overall ranking (DEA ranking versus Licensing income ranking): 22 positions. Surprisingly, Caltech was used in part by every inefficient university in setting targets; Caltech's technology transfer practices may therefore be particularly enlightening.

  • 6. Brigham Young University (BYU): BYU obtained an excellent efficiency rating primarily because of its remarkably low Total research spending relative to its outputs. It was used in setting targets for 35 of the 47 inefficient universities, largely to demonstrate how low research spending can still lead to credible output.

  • 7. Georgia Institute of Technology (Georgia Tech): Georgia Tech is perhaps the most surprising efficient DMU. Similar to Caltech, this excellent rating is due to a couple of factors, including strong numbers in terms of Startup companies and US Patent applications filed, given its lower-than-average Total research spending. Georgia Tech, tied for first place in the DEA ranking, made the largest leap forward in overall ranking (DEA ranking versus Licensing income ranking): 52 positions. Georgia Tech was used by only one inefficient university in setting a target output: the University of Illinois at Chicago and at Urbana-Champaign.

5.2 DEA inefficient universities

The remaining universities were deemed inefficient relative to the aforementioned seven. The inefficiency scores ranged from 112.5% to 619.3% (Table 4); the score indicates the amount of additional output that should be achievable with the same amount of research spending. North Carolina State University was the "least inefficient" (or closest to being efficient) with a score of 112.5%, indicating that it should have achieved at least 12.5% more of each output than it actually achieved using the same research spending. Meanwhile, Clemson received the lowest efficiency score among universities that generated more than $2 million in licensing income according to the AUTM survey. We also note how each target is formed as a function of peers (efficient universities). While some of these peers may be surprising, recall that when "blended" in the proportions given, they used no more research dollars to produce more of each of the five outputs.

Here is a closer look at some of the more inefficient universities (a worked example of the peer-mix calculation follows this list):

  • 1. Tulane University: Tulane received an efficiency score of 3.633, which resulted in a ranking of 52nd, a drop of 19 positions compared to the ranking based on Licensing income. Since DEA is a linear program, we can calculate a composite DMU for each inefficient DMU (university, in our case) based on a combination of the lambda values of its peer group (efficient DMUs). From this combination of lambda values, we can calculate targets for each output of each inefficient DMU. As an example, given its Total research spending for FY 2004, Tulane University's Licensing income could have been $13.9 million higher. A similar derivation can be produced for the other outputs. Tulane was compared to a mix of NYU (16%), Caltech (20%), and BYU (64%).

  • 2. Clemson University: Clemson's DEA efficiency score was the lowest of all 54 universities at 6.193. This is due mainly to poor results compared to its peers in terms of Startup companies (0), US Patent applications filed (24) and US Patents issued (8). Given its Total research spending, we would expect Clemson to have produced numbers in the range of 8, 169 and 50, respectively, for those outputs. Clemson was compared to a mix of NYU (10%), Caltech (31%), and BYU (59%).

  • 3. Emory University: Emory is the university with the largest dive in relative ranking (−29) between its placement in terms of Licensing income and its rank based on DEA efficiency. Although Emory University's Licensing income appears relatively large, it is insufficient given its large Total research spending of $326 million. Considering the target output calculations, Emory's Licensing income should have been $53.7 million, almost two and a half times the Licensing income this university yielded for FY 2004. Emory was compared to NYU (15%), University of Wisconsin Madison (23%), MIT (59%), and Caltech (2%).

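To make the peer mixes above concrete, the target outputs for Tulane, for example, are the lambda-weighted combination of its three peers (note that the weights sum to 1, as required by the variable-returns-to-scale constraint):

$$\hat{y}_{r,\text{Tulane}} = 0.16\, y_{r,\text{NYU}} + 0.20\, y_{r,\text{Caltech}} + 0.64\, y_{r,\text{BYU}}.$$
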
Overall, the results indicate that the average university in this list of high-licensing-income universities would need to achieve the increases listed in Table 5 to be efficient.

The results in the middle column are conservative in that they are averaged over all 54 universities, both the efficient and the inefficient ones. The results would be almost 15% higher if averaged over only the 47 inefficient universities. Nationwide, these results indicate that if all of the inefficient universities raised their technology transfer productivity to the level of the seven efficient universities, they would obtain a total of $659 million in additional licensing income for the year 2004. Given the nationwide challenges of higher education funding, even a small fraction of this would be very important.

6 Conclusions and future work

Table 6 summarizes the results identified through testing our hypotheses.

Services such as technology transfer are becoming increasingly important in today's US economy, and efforts to study them systematically so as to improve them are therefore becoming more valuable. This work has provided an example of how service engineering can be applied to benchmark a complex process such as university technology transfer.

This paper presented our initial effort to explore service industry efficiency through the case of university technology transfer. Performance varied widely across the 54 universities studied. Seven universities were found to be relatively efficient, but collectively the 47 inefficient universities needed to increase licensing income by $659 million, execute over 1400 more licenses and options, form over 200 more startups, file over 6000 more patent applications, and have over 2300 more patents granted to be efficient at the current level of research spending.

The results indicate that simple explanations such as public versus private status and the presence of a medical school do not explain the variation in technology transfer efficiency. Additional characteristics should be examined in future work, such as the number of people working in the TTO, the impact of different intellectual property policies, and faculty incentive systems. Another issue is finding the most productive scale size and determining whether there is a size too small to be practical for successful technology transfer. We therefore develop the following propositions for further research:

Proposition 1 Organization structure and operational processes/policies of the TTO impact the technology transfer efficiency.

Proposition 2 The level of authority and support given to the TTO in the university administration impacts the technology transfer efficiency.

Examining our results raises further questions about the determinants of efficiency. For example, universities in areas of better economic status, or those located closer to concentrations of venture capital firms or financial and high-technology company headquarters, may have additional focus and support for becoming more efficient. Community support may also be a factor: states that prioritize and support higher education may have more efficient universities. The following propositions would be worth examining:

Proposition 3 Regional concentrations of venture capital and high technology impact the technology transfer efficiency.

Proposition 4 The regional economic status impacts the technology transfer efficiency.

Proposition 5 The level of priority and support given to higher education in a community or region impacts the technology transfer efficiency.

A future expansion of this study would be to include all the universities taking part in the AUTM survey and to revisit the analyses conducted here. Additional data may let us identify differences when public universities are compared against private ones and when those with medical schools are compared with those without. We therefore propose reexamining those characteristics:

Proposition 6 There are no differences in university technology transfer efficiency between private and public institutions.

Proposition 7 There are differences in university technology transfer efficiency between universities with medical schools and those without.

Examining efficiencies across multiple years may also give us a further understanding of transfer efficiency. This corresponds to the following proposition:

Proposition 8 There are no differences in efficiency scores of a given university over time unless there is a significant change in organization structure.

Comparing efficiency scores of US universities against those in Canada, Europe or Asia may give us an insight into the sources of technological strengths of different geographies. We would expect similarities with Canadian universities, whereas differences may exist when compared with European and Asian universities due to cultural and legal system differences. Therefore, we developed the following propositions:

Proposition 9 There are no differences between US and Canadian universities in technology transfer efficiency.

Proposition 10 There are differences between US and European universities in technology transfer efficiency.

Proposition 11 There are differences between US and Asian universities in technology transfer efficiency.

Further research to examine the propositions will provide invaluable insight into the factors impacting university technology transfer and will provide excellent guidelines for both university and regional leaders on prioritization of efforts to improve the efficiency.



CONTACT: *Corresponding author. Tel.: +1 503 725 4582; fax: +1 503 725 4667.

Source: Technovation