% fortune -ae paul murphy

Why CERT should be decertified (1)

Last week CERT, the institute developed at Carnegie Mellon University and now "the operational arm of the National Cyber Security Division (NCSD) at the Department of Homeland Security," issued an annual systems security review and summary that drew widespread public attention.

Both the computer press and the mainstream media used this report as the basis for headlines like this one from news.com: "Linux/Unix more flawed than Windows, CERT says."

Here's the summary from The Washington Post:

According to US-CERT, researchers found 812 flaws in the Windows operating system, 2,328 problems in various versions of the Unix/Linux operating systems (Mac included). An additional 2,058 flaws affected multiple operating systems. There may well have been more than 5,198 flaws discovered this year; these were only the ones reported to US-CERT.

To the uninitiated that's pretty clear - Unix had almost three times as many problems as Windows - but it's the impact on Windows people who might have been considering a change that's most pernicious. Put yourself in their shoes: many of them have a hard time coping with Microsoft's continual patch requirements, and now they're told - by the US Federal Government - that Unix is nearly three times worse.

So the question has to be: is there a problem with Unix and, if so, how serious is it?

Let's start by looking at the Unix and multiple operating systems sections of CERT's list. Here are the first ten Unix vulnerabilities listed:

4D WebSTAR Grants Access to Remote Users and Elevated Privileges to Local Users
4D WebStar Remote IMAP Denial of Service
4D WebStar Tomcat Plugin Remote Buffer Overflow
4D WebStar Tomcat Plugin Remote Buffer Overflow (Updated)
Abuse Multiple Vulnerabilities
Adobe Acrobat Reader mailListIsPdf() Buffer Overflow (Updated)
Adobe Acrobat Reader mailListIsPdf() Buffer Overflow (Updated)
Adobe Acrobat Reader UnixAppOpenFilePerform Buffer Overflow
Adobe Acrobat Reader UnixAppOpenFilePerform Buffer Overflow (Updated)
Adobe Reader / Acrobat Arbitrary Code Execution & Elevated Privileges

Two things should strike you about this:

  1. first, that a large number appear to be duplicates - in fact 1,442, or 62%, are duplicates of other listings (an easy thing to check; see the sketch after this list); and,

  2. second, that none of these seems to be Unix related - they're essentially all application related, and so are the 2,058 classified as affecting multiple operating systems.
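The duplication is easy to check for yourself. Here's a minimal sketch in Python; the sample titles come from the excerpt above, and the matching rule - that duplicates differ only by an "(Updated)" suffix or by appearing verbatim twice - is my assumption, not CERT's classification:

    # Illustrative only: normalize CERT entry titles and count duplicates,
    # assuming duplicates differ only by a trailing "(Updated)" marker.
    import re
    from collections import Counter

    entries = [
        "4D WebStar Tomcat Plugin Remote Buffer Overflow",
        "4D WebStar Tomcat Plugin Remote Buffer Overflow (Updated)",
        "Adobe Acrobat Reader mailListIsPdf() Buffer Overflow (Updated)",
        "Adobe Acrobat Reader mailListIsPdf() Buffer Overflow (Updated)",
        "Adobe Acrobat Reader UnixAppOpenFilePerform Buffer Overflow",
        "Adobe Acrobat Reader UnixAppOpenFilePerform Buffer Overflow (Updated)",
    ]

    def normalize(title):
        # Strip a trailing "(Updated)" marker and surrounding whitespace.
        return re.sub(r"\s*\(Updated\)\s*$", "", title)

    counts = Counter(normalize(t) for t in entries)
    duplicates = sum(n - 1 for n in counts.values())
    print(f"{len(entries)} listings, {len(counts)} distinct, {duplicates} duplicates")

Run against these six sample entries it reports three duplicates; apply the same kind of matching to the full list and you get the 1,442 noted above.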

Although most of the press simply ran with the numbers, a few people did look at CERT's values a little more critically. Most notably, the two Joes, Brockmeier and Barr, at Newsforge.com, part of whose review reads:

This is not to say that the data from US-CERT is a meaningless aggregation. You can easily spot the most vulnerable operating system in wide use today by taking a look at the Technical Cyber Security Alerts issued by US-CERT this year. Here's the bottom line:

That's quite a different picture than the one the Microsoft press machine wants you to see. Here's more of the same. US-CERT's list of current vulnerabilities contains a total of 11 vulnerabilities, six of which mention Windows by name, and none of which mentions Linux.

They're right, but it's actually a lot worse than that.

Many of the listed vulnerabilities aren't even backed by CVE listings - but consider one that is: #231, "Debian Lintian Insecure Temporary File". CERT describes this as:

A vulnerability exists because temporary files are created in an insecure manner, which could let a malicious user delete arbitrary files.

The alert was processed by CERT in January of 2005, but originated as a bug report in June of 2004. Here's the Mitre CVE listing for it:

lintian 1.23 and earlier removes the working directory even if it was not created by lintian, which may allow local users to delete arbitrary files or directories via a symlink attack.

Mitre's source for this is a December 2004 Debian patch report.

Follow that up, and you find that the vulnerability was more imagined than real. Here's what Jeroen van Wolffelaar, the maintainer for that code, had to say when the issue came to his attention on December 19th:

I noticed this before, but at that time didn't think it was a security issue. Directory creation would simply fail if that name is already taken, and the cleanup afterwards is harmless. If the name is not yet taken, no issue.

However, when re-reading, I see that this assessment was a misreading of the sources.

But a day later he reports:

Argh, after looking again, I still stand by my initial assessment; I was misled by the theory that the logic was bogus. The key point is:

| if (not -d "$LINTIAN_LAB" or ($lab_mode eq 'temporary')) {
|     mkdir($LINTIAN_LAB, 0777) or fail("cannot create lab directory $LINTIAN_LAB");
| }

And this is correct. If $lab_mode is not 'temporary', a lab location was specifically given to lintian, and we should assume that the invoker of lintian in that case knows what he is doing. In all other cases, i.e., when lab_mode equals 'temporary', the condition in the if is true (note the 'or'), and an attempt is always made to create the lab directory, which fails if it already exists.
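His reading is easy to verify, because mkdir() fails on, rather than follows, anything already sitting at the target path: an attacker who pre-creates a symlink there just makes lintian abort. Here's a minimal sketch of the same check - mine, not lintian's code, and the path is hypothetical:

    # Illustrative only: why "mkdir or fail" defeats a symlink
    # pre-creation attack on a predictable temporary path.
    import os
    import sys

    lab = "/tmp/lintian-lab"  # hypothetical predictable location

    try:
        os.mkdir(lab, 0o777)  # raises if anything already exists at this path
    except FileExistsError:
        # An attacker's pre-created symlink (or anything else) lands here:
        # the program aborts instead of operating on the attacker's target.
        sys.exit(f"cannot create lab directory {lab}")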

In other words, the problem never existed: it was (erroneously) reported in mid-2004, cleared before the end of that year, and still counted against Unix in a 2005 summary claiming the authority of the United States Government.

It's not the only one either. The overwhelming majority of the Unix and application vulnerabilities listed either can't be exploited in current products, require absurdly unlikely circumstances to become actionable, or have no factual basis.

That same liberal use of imagination affects a lot of the listings in the section on multiple operating systems vulnerabilities, because that section covers applications originating primarily on Unix. Consider, for example, this rather desperate attempt to see a vulnerability in a widely used Unix application:

The apache2handler SAPI (sapi_apache2.c) in the Apache module (mod_php) for PHP 5.x before 5.1.0 final and 4.4 before 4.4.1 final allows attackers to cause a denial of service (segmentation fault) via the session.save_path option in a .htaccess file or VirtualHost.

This is based on the following:

From: Eric Romang / ZATAZ.com (exploits@zataz.net)
Date: Mon Oct 24 2005 - 02:36:38 CDT

Hello,

Here under is some stuff to DoS apache + php just through an htaccess.

* With .htaccess method :

If you have into your php.ini -> safe_mode = On

Simply put a .htaccess file on the root directory of your website with this content :

php_value session.save_path /var/www/somewherehowexist

Apache segfault with :

[Fri Sep 30 10:33:11 2005] [notice] child pid 17743 exit signal
Trace/breakpoint trap (5)

There was a bug in the apache2handler SAPI (sapi_apache2.c file) that made this segfault possible; the bug is now fixed upstream, and 5.1.0 final, 4.4.1 final, and the next 5.0.X release will have the patch.

Also work with session.save_path into a VirtualHost.

Gentoo bug report :

http://bugs.gentoo.org/show_bug.cgi?id=107602
and
http://bugs.gentoo.org/show_bug.cgi?id=98871

In other words, this was a bug that was found and fixed in pre-release testing.

CERT's excuse here is that it just publishes the facts - a list of claimed vulnerabilities - and if the public doesn't read the fine print, well, that's not their problem, is it?

But, of course, it is. CERT is a government agency: higher standards apply.

CERT issues this kind of public information in the full knowledge that both the mainstream media and the technology press will run with it. Basically, they can't help but know that they're misleading the public, and that's utterly irresponsible at best and dishonest at worst.

CERT isn't responsible for educating the mainstream press, but they can, and should, respond to predictable ignorance among their primary audience by providing a more meaningful metric than simple counts of claimed vulnerabilities. Something along the lines of cancer rates per 100,000 population would work nicely: weigh the risk associated with each vulnerability according to the number of systems per 100,000 in the installed base that are actually affected, and you'd get a reportable statistic ranging from zero for most of the claimed Unix vulnerabilities to nearly 99,999 for something like the current WMF scare (reported, by the way, in 2005 but not counted against Microsoft in this list).
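As a sketch of the arithmetic - every name and exposure figure below is invented for illustration:

    # Illustrative only: a per-100,000 exposure metric for vulnerability
    # reports. The names and numbers are made up; the arithmetic is the point.

    installed_base = 1_000_000  # assumed number of deployed systems

    # (claimed vulnerability, systems in that base actually exposed)
    reports = [
        ("bogus lintian temp-file issue", 0),     # never exploitable
        ("pre-release PHP segfault", 0),          # fixed before release
        ("unpatched file-format flaw", 990_000),  # a WMF-scale exposure
    ]

    for name, exposed in reports:
        rate = exposed / installed_base * 100_000
        print(f"{name}: {rate:,.0f} affected per 100,000")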

Periodic summaries, like the current year-end report, could then give cumulative risk statistics in percentage terms - something the mainstream press could easily understand and pass on to its audience.

I'll have more on this next week, but for now: if you're American, how about calling or writing your congressman and senator and raising the issue with them? Failing that, if you're a Unix vendor and you've lost a sale to security concerns, consider asking a lawyer how much liability CERT could be assessed for.


Paul Murphy wrote and published The Unix Guide to Defenestration. Murphy is a 25-year veteran of the I.T. consulting industry, specialising in Unix and Unix-related management issues.