UW Continues to Ride Fake Startup Metrics

From 2008 to 2015, the University of Washington faked its startup metrics and won itself awards and a reputation for entrepreneurial and innovative chops. (See articles discussing UW fakery here and here and here.) Senior UW administrators concocted a story about startups and repeated it regularly, expressly endorsed by the university’s president. Even with changes in the UW president and licensing office, the new officials happily adopted those same faked metrics and made them their own.

Here we are in 2018, and the university continues to promote itself based on its fake metrics. Here’s a current “CoMotion” banner ad that shows up at the bottom of many of its web pages:

[Update–UW is still at it. Here’s the current banner footer, July 2020:

Notice that they cite startups from 2017, and the Milken report is still the 2017 one, although it is, um, let me see, midway through 2020. It’s not that they are even trying to be accurate. The Milken survey–as pointed out below–is still laughable. The primary measure of “innovation” is how many academic papers from a given university get cited in patents–but patents cite literature to demonstrate that the claimed invention *is not that* and *is not obvious given that*. Just the opposite of what the doofs at Milken have it. Other measures figure as well–mostly, apparently, to show off the Reuters patent database as a marketing thing. Other than UW’s *volume* of publications, grant income, and non-selective patenting, it would appear that university #6, the University of North Carolina, is more effective at technology transfer. But ranking institutions for “innovation” is a nonsense game, played by folks who delight in the rhetoric of nonsense. “Innovation” is not to be conflated with patents (they could block and delay), licensing (which just deepens the blocking and delaying), income (which could come from speculative trading on things never brought to market or price gouging on things that are), or fake, bureaucrat-enabled, bean-counted startups (enough said).]

Let’s consider the university’s claims, starting with the number of startups launched. CoMotion cites its source as the Association of University Technology Managers (AUTM). Each year AUTM conducts a licensing survey. Not all universities participate–including some with major research programs. The thing about AUTM’s survey, however, is that it merely reports what participating universities report to AUTM. For years, UW faked its startup metrics and reported more than double the actual number of startups that met AUTM’s criteria for a reportable startup–essentially, a company formed with the express intent to license an invention from a university, whether or not the company has finalized that license within the reporting period for the AUTM survey.

When the University of Washington’s CoMotion (and before that, C4C, and before that TechTrans, and before that the OTL, and before that the OTT) reports startups to AUTM, AUTM then merely repeats that count in its survey report. The only thing AUTM does is compile the counts provided by survey participants. AUTM does not audit the figures to determine whether a university is ignoring AUTM’s definition of a startup or is otherwise faking its numbers. Nor does AUTM attempt to disaggregate startup reports to determine whether multiple universities are claiming the same startup–which happens, and isn’t a problem unless someone, say at AUTM, adds up the reported startups and presents that figure as the total number of startups rather than as the sum of the reports (which almost always contains duplicates).
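The double-counting problem is simple arithmetic. Here is a toy sketch–the universities and startup names are hypothetical–showing how summing per-university reports overstates the number of distinct companies:

```python
# Hypothetical survey reports: two universities each claim the same startup.
reports = {
    "UniA": ["Acme Bio", "Nimbus Soft"],
    "UniB": ["Acme Bio", "Quark Devices"],  # "Acme Bio" claimed twice
}

# What a naive total does: add up every report.
total_reported = sum(len(startups) for startups in reports.values())

# What disaggregation would do: count distinct companies.
distinct_startups = len({s for startups in reports.values() for s in startups})

print(total_reported, distinct_startups)  # 4 reported, 3 actual companies
```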

Thus, when the UW reports that it is #7 in AUTM’s count of startups, the UW is reporting that its fake startup metrics show up at #7. AUTM is not the source of the startup count–UW is. CoMotion is playing the public for fools by citing AUTM as the source for its own–faked–information.

Consider next UW’s claim to the seventh “best university for technology transfer” according to a Milken Institute report. Here is the relevant table showing how Milken’s algorithm determines what’s “best”:

It’s a strange algorithm: it counts patents, licenses, income, and startups, but it doesn’t associate licenses with either patents or startups, and it does not consider the distribution of income among licenses. One big-hit license means the income field does exceedingly well, even if the licensing program has been an abject failure for two decades in every other respect. Sure, lots of licenses “issued”–for software products, for biomaterials, for junk thrown around in consortia–but not necessarily licenses that result in any actual beneficial use. Show licenses, resulting in nothing–but the license numbers are up and the income figures (big hit for twenty years!) are way up, so the stoopid Milken algorithm says this program is tops.
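The distortion is easy to make concrete. In the toy index below, only the 35% startup weight is taken from the Milken report; the other weights and both programs’ numbers are invented for illustration. A program carried by one blockbuster income stream, a pile of show licenses, and inflated startup counts outscores a program with broad, modest, actual use:

```python
# Toy version of a Milken-style weighted index. Only the 35% startup weight
# comes from the report; the other weights and all program data are
# hypothetical illustrations.
WEIGHTS = {"patents": 0.25, "licenses": 0.25, "income": 0.15, "startups": 0.35}

programs = {
    # High license count, one blockbuster income stream, inflated startups.
    "inflated_counts": {"patents": 40, "licenses": 1000,
                        "income": 42_800_000, "startups": 18},
    # Broad, modest, real activity.
    "broad_use": {"patents": 60, "licenses": 120,
                  "income": 3_000_000, "startups": 10},
}

def normalized(metric):
    """Scale each program's value to the best performer on that metric."""
    top = max(p[metric] for p in programs.values())
    return {name: p[metric] / top for name, p in programs.items()}

def index_score(name):
    """Weighted sum of normalized sub-scores, as the table describes."""
    return sum(w * normalized(metric)[name] for metric, w in WEIGHTS.items())

for name in programs:
    print(name, round(index_score(name), 3))
```

The income and license sub-scores reward the blockbuster program regardless of whether any license ever produced a product in use.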

In the Milken fantasy world of best technology transfer programs, the number of startups formed counts (arbitrarily, it appears) for 35% of the total score. Where does Milken get its numbers for startups? Why, from AUTM (except for the University of California numbers, which it gets from UC):

The Index is based on data collected by the Association of University Technology Managers (AUTM) via the AUTM’s Annual Licensing Activity Survey, with one exception, the University of California System.

So UW’s faked metrics on startups get supplied to AUTM, which launders them, and then Milken can use them as input into its algorithm–garbage in/garbage out. Thus, when CoMotion cites Milken’s praise for UW’s technology transfer program, CoMotion is citing a primary effect of UW’s fraud on the public by faking its startup metrics.

Here’s what the Milken report has to say about UW:

University of Washington (UW) ranks seventh, an appreciable increase from 24th in 2006, with an index score of 95.11. UW had the highest score on our licenses issued sub-index, the only institution to file more than 1,000 between 2012-2015.

The license count is high because AUTM counts as a license every tech transfer deal worth at least $1,000, regardless of the IP involved. When I was at UW, I set up a licensing program for software and digital media based on projects rather than patents as the primary intangible asset. Those leading each project could then decide how the project would make research assets available–many chose open, or open with support services, or available at cost, or available for a price that supported the distribution and support activity. A given project could have scores (in some cases hundreds) of licenses, and each license that included a support option renewed annually. You can see how software, data, and digital media projects could easily ramp up the license count reported to AUTM.

UW does indeed then issue plenty of licenses–it’s just that most of them don’t involve *patents* and aren’t for *commercialization*–they are project-based or involve biomaterials; these licenses confirm access, regulate dissemination of changes that haven’t been validated, ensure interoperability, and provide cost recovery for distribution and support. An equivalent would be the subscription fees charged by journals or registration fees to attend workshops or to participate in a job fair. A cover charge–then drink yourself as silly as you want, or not.

But Milken’s algorithm is too dumb to recognize different approaches universities might take to licensing. Yes, the UW approach is way more effective at getting research materials out in useful, supported form to a broad industry-based research community, but that isn’t the “technology transfer” that Milken’s report so values.

It’s not that UW here is faking its licensing metrics–I expect that UW reports to AUTM with regard to licensing what AUTM’s survey defines as a reportable license (unlike how UW has reported startups). It’s just that AUTM’s definition of a reportable license is at odds with AUTM’s claims regarding patentable inventions. UW exploits AUTM to leave the impression that UW is doing a great job licensing its patent portfolio, when that’s not likely the case.

Of course, no university reports how it is doing with licensing its patent portfolio, and AUTM’s treatment of patents and licensing makes sure that no one ever does find out. The patents issued in a given year bear almost no relationship to the licenses granted in that same year. If we wanted to see how a university patent portfolio is doing with regard to use, we would ask to see the patents in the portfolio, and for each whether the patent had been licensed, and whether the license had resulted in (i) practical application and (ii) benefits to the public on reasonable terms. It wouldn’t matter if a patent had been licensed for a huge amount, for $1,000, or royalty-free. Heck, under AUTM’s survey criteria, it appears that a royalty-free license isn’t reportable.

License income was another strength, generating $42.8 million in 2015.

2015 was about the last year of the Hall patent money, which accounts for something like 90% of the reported total. Since UW’s income is reported as both WRF’s and UW’s, it isn’t clear whether Milken (or UW reporting to AUTM) totals both income amounts–since WRF pays a share of the Hall royalties to UW. In any event, CoMotion is not responsible for the Hall licenses–WRF did those deals way back in the 1980s, and UW technology transfer, under its various names and structures, has lived on its “share” of the payment WRF makes each year while its own licensing income streams have withered away. Again, the Milken algorithm is blind to how income is generated. UW could run the most gawd-awful technology transfer program for three decades and the Milken algorithm would still swoon over the income from a single invention.

Restructured as CoMotion in 2015, UW broadened its TTO mission to better integrate the university into the innovation ecosystem.

Silliness. Things here are backwards and incomprehensible. What does it mean to “broaden” a “TTO mission”? Perhaps that means moving beyond licensing patents for commercialization, or beyond creating startup companies to take licenses to patents because no one else will. But what does that mean? It can’t mean moving to non-exclusive licensing of intangible assets–that was already part of the “TTO mission.” It would appear that the “broadening” here involves activities that don’t involve technology transfer from the university to anyone: the university supplies space for startups, or invests in startups that don’t have anything to do with university-developed research technologies. Fine, I suppose–but that’s not activity that measures the bestedness of a university technology transfer program. No, what CoMotion apparently means is “innovation” transfer rather than “technology transfer”:

CoMotion labs, innovation grants, and mentorship all aim to foster innovation transfer in a wide range of fields, with wet and dry labs along with office space available for 30-50 start-ups in their incubator.

At this point, it’s not at all clear what “innovation transfer” means, given that any number of universities offer access to labs and allow their faculty to consult with industry. Even federal labs have their CRADA programs. It would appear that “innovation” here is wording that means “do what a bunch of other universities are doing” so they can claim credit for “helping” whatever company might end up becoming successful–and adding that success to whatever little they are actually doing to advance the beneficial use of inventions made in their own research programs–you know, the reason for that TTO mission in the first place.

The UW wet and dry lab space would be Fluke Hall, which UW took over for its own startups and chased out the industry-directed groups for which John Fluke of Fluke Instruments made his generous donation that funded the construction of Fluke Hall.

A leading recipient of federal research grants, UW spent $1.4 billion on research in 2015.

Not quite. UW may have received $1.4 billion in 2015, but according to the NSF, UW reported that it spent $1.18 billion in 2015–a figure about 15% lower than what UW reports out. UW serves as a major federal grants aggregator and subcontracts grant funds it receives out to various partnering institutions. It’s an easy mistake, and UW doesn’t go out of its way to correct the higher figure, which is all about conditioning public and legislative opinion, not feeding data into an algorithm to decide who does technology transfer best.
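The arithmetic on those two figures works out as follows:

```python
# Figures from the paragraph above: $1.4 billion reported received vs.
# $1.18 billion reported spent (per NSF).
received = 1.40e9
spent = 1.18e9

pct_lower = (received - spent) / received * 100
print(round(pct_lower, 1))  # prints 15.7 -- "about 15% lower"
```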

The UW School of Medicine attracted half of the research dollars in 2016, and 27 percent of the start-ups in the CoMotion labs were in the biotech and healthcare field.

Funny, how UW startups are so much weaker in biomedical than the funding to the School of Medicine would indicate–half the funding but only one quarter of the startups. What’s up with that?

Information technology and software firms made up 26 percent, with Apple acquiring Turi for $200 million in 2016.

Milken makes it appear that Turi is the result of CoMotion activities. Even CoMotion cites Turi as one of its “startups.” But Turi started as an open source project called GraphLab when Carlos Guestrin was at CMU. Guestrin built a startup around that open source project, first called GraphLab and then, with venture funding in 2015, called Dato. According to the USPTO patent database, Carlos Guestrin’s related patent applications are assigned to Dato, not to UW. Dato then changed its name to Turi after a trademark dispute. GraphLab/Dato/Turi was not formed to take a license from UW, and inventions made at GraphLab/Dato/Turi stayed at Turi–they weren’t owned by UW, weren’t managed by CoMotion, weren’t *transferred* by CoMotion. Dato/Turi doesn’t show up on CoMotion’s list of “startups.” There are limits, apparently, to the startup numbers UW is willing to fake. But you see, CoMotion has “broadened” its mission–apparently so UW can claim anything remotely “innovative” as an outcome of that “mission.” Lame.

In a sense, the Turi story is a wonderful story about effective university technology transfer that doesn’t involve a TTO or its mission: open source at CMU; faculty leader moves to new school and still has access to the source; creates a company wrapper outside UW’s claims; secures venture funding; and the company is acquired; the faculty founder ends up in a high powered position in a major technology company. Very cool. UW’s CSE department now has a friend in a high place–one of the best bits of NIPIA that one could desire.

Nothing in this scenario matches anything in university descriptions of the “technology transfer” process they demand. But it does illustrate an effective approach–open access, wrap a company around services and directed development, get financing or sales, sell up the directed assets, start a new round of development within a corporate setting for maximal deployment and benefit. But that’s not what Milken or AUTM has been selling. It is, however, what Research Enterprise has been all about, make-use commons, techniques for crossing a commons, strength of weak ties, NIPIA as an alternative to IP (and especially NIPIA in networks of relationship) and using IP to create other intangible assets of value.

UW has partnered with major local firms on CoMotion initiatives, including Microsoft on the Global Innovation Exchange and Amazon on the Amazon Catalyst program, which award grants of up to $100,000 to UW innovators who address large, real-world problems.

Very nice stuff. It’s difficult to see how these were CoMotion initiatives, however.

Now let’s take a look at the first CoMotion claim: that UW is the most “innovative” public university in the entire world. The ranking is provided by Reuters. It appears that the idea behind the ranking is to give visibility to the $3.5 billion spinoff of the Thomson Reuters databases of patents and publications to the startup Clarivate Analytics.

To create our ranking of the world’s most innovative universities, Reuters News relied on data compiled by Clarivate Analytics and several of its research platforms: InCites, Web of Science,  Derwent Innovations Index, Derwent World Patents Index and Patents Citation Index.

In a way, the Reuters innovation ranking was clever marketing to set up a spin-off deal. Here’s the ranking methodology. Reuters starts with the universities that show up most frequently in its Web of Science Core Collection publication database. It then cross-references those universities against its patent databases, and restricts the list to universities that filed at least 70 PCT patent applications (50 for non-US universities).

In the US, a PCT application costs about $3K (patent firms’ prices vary!) in addition to the cost of a US utility patent application filing, and with modern patent administration software amounts to a keypress to produce a conforming PCT application from an existing US utility patent application.

Then, a strange set of comparisons get made:

Each candidate university was then evaluated using various indicators including how often a university’s patent applications were granted, how many patents were filed with global patent offices and local authorities and how often the university’s patents were cited by others. Universities were also evaluated in terms of how often their research papers were cited by patents and the percentage of articles that featured a co-author from industry.

That is, a bunch of correlations between publications in the now ex-Thomson Reuters publication databases and patents in the now ex-Thomson Reuters patent databases. Each correlation is assigned a score, and the scores are added together to create a total. The totals can then be ordered from higher to lower. The result is a list of universities. That these universities may be described as “most innovative” is strictly a matter of the marketing department at Reuters deciding to call its index one of “innovativeness.” How on earth?
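The mechanics are no deeper than this sketch–the universities and indicator scores are hypothetical, and the actual indicators Reuters uses are the ones listed in the quoted passage:

```python
# Each university gets a score per indicator; scores are summed and sorted.
# Universities and scores here are hypothetical.
indicator_scores = {
    "UniA": {"grant_rate": 0.7, "patent_citations": 0.9, "paper_cites": 0.8},
    "UniB": {"grant_rate": 0.9, "patent_citations": 0.4, "paper_cites": 0.5},
}

totals = {u: sum(scores.values()) for u, scores in indicator_scores.items()}

# The sorted list is all the ranking is; "most innovative" is just its label.
ranking = sorted(totals, key=totals.get, reverse=True)
print(ranking)  # prints ['UniA', 'UniB']
```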

We can have different ideas about innovation, of course. We could go with Everett Rogers’s idea that innovation is anything new to an adopting group. Or we could go with the OECD definition of product innovation: “a product or service that is new or improved.” Or the Cambridge Dictionary definition: “(the use of) a new idea or method.” Or how about this definition from a UK business school: “Innovation is the process by which new ideas are successfully exploited to create economic, social and environmental value.”

Or maybe we go with Benoît Godin’s extensive documentation of the varied ways in which “innovation” has been used–including “introducing change into the established order.” By any of these definitions, the Reuters concoction of patents and publication citations has nothing to do with new products, services, exploitation of new anythings, or changing an established order with them.

At best, Reuters’ former databases track announcements of what’s publishable and spending on claims of what is patentable. But what is published and what is patented has next to nothing to do with whether what’s new gets used successfully–or used to disrupt or change an established order. It’s hard to see how what gets published from a university, or what gets patented by a university, has anything to do with “innovation.” Indeed, by patenting inventive work, a university may well prevent people from adopting that work–everyone must avoid it because the university does not make the work readily available and instead holds out for an exclusive licensee, who rarely shows up. Thus, a count of patents is much more immediately a measure of anti-innovation.

The Reuters algorithm does not even consider licensing or transfer of any of the ideas or inventions reported by publications or patents. And even licensing does not mean “innovation”–it just means that a university has formally promised not to sue certain others for infringement. There’s nothing to indicate that those acquiring a license ever use–especially successfully–what they have licensed. Many “commercialization” patent licenses never result in commercial product–the license is just another indication that a given invention is unavailable for use pending some company futzing around with it for years before abandoning it.

There’s nothing “innovative” about what Reuters measures. Reuters just free associates presence in its former databases with innovation and UW is fool enough to adopt such a dysfunctional assertion of “innovativeness” simply because it found a company willing to make that assertion. Sure, the public may be snookered by fancy-dancy databases. But the UW ought to have the integrity not to make such goofballedness an anchor of its publicity.

We can see, then, that CoMotion takes the fake startup metrics produced by its predecessor C4C and rides them to make it appear that CoMotion is a world leader in technology transfer. It’s a form of self-flattery put out as if it were objective information. If CoMotion lies about its own status, what credibility should we expect in its other technology transfer practices? Falsus in uno, falsus in omnibus.



This entry was posted in Bozonet, Metrics.