Sunday, April 29, 2007

High value higher ed and compromise profiles

Dave G from Matasano Chargen posted about "Mac Punditry and the Office Paradox". My higher-ed-focused ears perked up when he asked, "How are educational environments high value targets?"

Higher ed security folks know that we're major targets - and attractive targets too. Here's why:

  1. Open networks - large public IP spaces, often with relatively loose border controls.
  2. Open systems - frequently systems are not centrally supported, and there can be a wide separation between system security postures across the network.
  3. High bandwidth - universities have bandwidth that many commercial entities and ISPs would be jealous of. Internet 2 connections are typically at least a gigabit, and some universities connect to things like the TeraGrid's 10 - 40 gig research backbone or other specialized high speed networks.
  4. High value data - Social Security numbers, research data, personal data on students, faculty, staff, and donors, and credit card operations are all on the list for most higher ed institutions.
Historically, educational institutions have had relatively open borders. This has changed over the past few years as universities implement border firewalls and other protections. Universities are still in a somewhat unusual position - many have residential networks that act as ISPs for their students, and standards for academic and research freedom make a corporate-style security architecture difficult, if not impossible.

Watching vendors and other security professionals react to statements along the lines of "well, no, we can't presume that it will be firewall protected" or "we have a class B, and everything we own is on a public IP" can be fun if you enjoy looks of sheer terror.

With attractive and relatively exposed systems - and a population that is often less formally controlled than those in the corporate world (at least in similarly sized institutions) - compromises will occur. What do they look like?

Higher education security staff tend to see three attack profiles on systems - and I think that these three attack profiles show the three types of value that compromised systems have for attackers. They are:

1. Zombies - low value systems with no real useful data, smaller hard drives, and low to middling bandwidth. Often these are student machines on residential networks, or standard employee desktops. These systems tend to participate in botnets, and are valuable simply because of numbers and the fact that many are either not detected, or are not cleaned up properly if they are.

2. Storage and bandwidth - these systems are characterized by larger hard drives and bigger pipes. Universities often have relatively large pipes, whether thanks to an Internet2 link, a research network link, or simply large commodity Internet upstream connections. When an attacker who actually cares about system profiles compromises a machine with a bigger drive or more bandwidth, it tends to end up being used as a storage and distribution point.

3. Jumping off points - every organization has hosts that have sensitive data or that can be used to move through other systems. Your local IT support staff machines are a great jumping off point, and so is a dean's machine, or a business office system. Normally, you will want to focus your forensic efforts on these systems, as they are more likely to have sensitive data - or the keys to the kingdom. A single critical IT worker's machine can let attackers romp through your infrastructure. Compromises of these machines tend to be detected via system monitoring software such as AV and anti-spyware, via user notification, or via the network monitoring techniques that catch the other two profiles.

So how do you deal with these compromise profiles? Your detection and response tactics will likely vary due to your level of control and access.

Zombies can often be found by monitoring outbound IRC connections or by using netflows and other monitoring technologies to keep track of known bad hosts on the outside. Most normal systems on campus won't be talking to your friendly neighborhood C&C. Since these systems are often outside of the normal support infrastructure for business or academic computing organizations, you have to work with your residential network support or enforce your AUP (you do have an acceptable use policy, don't you?).
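As a sketch of that kind of flow monitoring - the record format, ports, and blocklist addresses below are all invented examples, not real indicators:

```python
# Flag likely zombies by looking for outbound connections to IRC ports or
# known C&C hosts in flow data. Ports, addresses, and the flow format are
# hypothetical -- substitute your own netflow export and threat feed.

IRC_PORTS = {6666, 6667, 6668, 6669, 7000}
KNOWN_BAD = {"203.0.113.7", "198.51.100.23"}  # placeholder C&C addresses

# (src_ip, dst_ip, dst_port) tuples, as parsed from a flow export
flows = [
    ("10.1.20.5", "192.0.2.10", 80),       # ordinary web traffic
    ("10.1.20.9", "203.0.113.7", 6667),    # resnet host talking to a C&C
    ("10.1.20.9", "198.51.100.23", 7000),
]

suspects = {}
for src, dst, dport in flows:
    if dport in IRC_PORTS or dst in KNOWN_BAD:
        suspects.setdefault(src, []).append((dst, dport))

for host, conns in sorted(suspects.items()):
    print(f"{host}: {len(conns)} suspicious flow(s)")
```

Real C&C channels increasingly hide on nonstandard ports, so the destination blocklist ends up doing more work than the port match over time.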

Storage and bandwidth oriented hacks may not be reporting into a central C&C - although more and more do dial home. Use your border flows to check for hosts that have very high bandwidth usage - a good tactic is to check your top 10 or top 20 hosts on a daily basis, then filter out the known good hosts, check into the rest, rinse, and repeat. Yes, that public FTP mirror will be high, but why is a grad student's desktop machine in the top 10 for outbound traffic?
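That top-talkers check is easy to script against daily flow summaries; the hosts, byte counts, and allowlist here are made up for illustration:

```python
# Daily "top talkers" report: rank local hosts by outbound bytes, then
# filter out known-good heavy hitters. All data below is invented.
from collections import Counter

ALLOWLIST = {"10.0.0.5"}  # e.g. the public FTP mirror -- expected to be high

# (local_host, bytes_out) pairs aggregated from a day of border flows
flow_totals = [
    ("10.0.0.5", 900_000_000_000),   # the FTP mirror
    ("10.3.7.42", 120_000_000_000),  # why is a desktop this high?
    ("10.0.0.8", 2_000_000_000),
]

totals = Counter()
for host, nbytes in flow_totals:
    totals[host] += nbytes

# Report the top 10, skipping the allowlist; investigate what's left.
report = [
    (host, nbytes)
    for host, nbytes in totals.most_common(10)
    if host not in ALLOWLIST
]
for host, nbytes in report:
    print(f"{host}\t{nbytes / 1e9:.1f} GB out")
```

The rinse-and-repeat part is updating the allowlist as you clear hosts, so tomorrow's report only contains new anomalies.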

Jumping off points are the security analyst's bad day - follow your IR procedures, and make sure you understand what the system had on it, and what access the people who used the system have to other resources. The chain can be long, but following it can help ensure that further compromises don't occur.

The final analysis - at least from my perspective - is that higher ed does have high value hosts. Educational institutions cover the spectrum of sensitive data from research data to SSNs and credit card data. In addition, most are highly concerned about their image, meaning that a data breach can cause significant damage to reputation, even if financial losses are smaller.

Where does that leave us? I'll write more about some of the directions that universities are moving in to handle both intrusion and extrusion detection in a coming post.

Thursday, April 26, 2007

What's old is new again: email extortion and urban legends

Much like fashion, the Internet makes old things new again at a startling pace. Dark Reading is carrying an article courtesy of Information Week about a "new" email scam - assassins have been hired to kill you, and if you bribe them, they won't. Unfortunately...this isn't really a new scam. In fact, Snopes has references back to 2006, and an FBI recommendation from December of 2006 - which the article does note. What is newer is that the mailing lists harvested for it seem to target professionals. Not quite spearphishing, but definitely more targeted than your daily allotment of prescription drug and enhancement spam.

Moral of the story? Keep Snopes, ScamBusters, and the CIAC's Hoaxbusters sites handy, and don't panic. If you do know somebody who has succumbed to the scam, point them to law enforcement and the Internet Crime Complaint Center (IC3).

Wednesday, April 25, 2007

If you're still shopping here, do you mind if we lose your data again?

There's a post at Emergent Chaos about how many customers you may lose if you lose their data more than once. I recently asked if you would still shop at a company that lost your data. These statistics are interesting, as they show that at least in some populations, there is a direct churn-rate effect from repeated data loss. The question remains: what about institutions where you're not a customer, but instead belong to a population that they serve?

How is that different? Well, there are organizations like universities and the VA that will retain records on you long after a bank, credit card company, or other institution would (hopefully) have destroyed your data. You will always be a graduate of your alma mater, and you will always be a veteran - and they will retain your data. Another group that we have little choice in dealing with is credit monitoring and reporting agencies - a breach of one of the major agencies could have serious repercussions. We've already seen third party processors announce compromises.

How can you protect yourself in this case? You largely can't - the responsibility lies with the data holder, and that is what should concern us: in some of these cases there is no incentive to retain customers, we have no way to remove our data from their databases, and there are often few penalties for losing data that isn't protected by legislation. We may at least hear about it thanks to legislation that requires reporting - and that's a start.

Tuesday, April 24, 2007

Security deals: Free for life PKWare SecureZIP for Windows

PKWare has SecureZIP available as a free for life download. SC Magazine reviewed it, and seemed to like it. It looks worth checking out if you're looking for something that integrates into Outlook and other software for free.

Monday, April 23, 2007

Paranoid yet? Van Eck Phreaking your LCD

Most of us don't play in the uber-paranoid world that worries about Van Eck phreaking, but New Scientist has an interesting short article about the possibilities of Van Eck phreaking LCDs. For years, most of us had been under the assumption that it was nigh unto impossible to grab enough RF from a laptop or LCD display to view the image - that's not always the case.

Time to run Tinfoil Hat Linux!

Friday, April 20, 2007

Web hacks - gzinflate and base64 encoding

A recent exploit I ran into dropped a remote administration console into a PHP script. I've seen similar exploits before, but this was the first time I had personally run into PHP exploit code nested in the script that used this bit of PHP:

eval(gzinflate(base64_decode('encoded file...')));
This is reasonably clever, as it keeps the file size down and makes the exploit a bit harder to figure out (should we call that insecurity through obscurity?). Vendors also do this at times to try to protect their code, although it is not a very effective reverse engineering prevention scheme - at best it keeps casual viewers from stealing your code.

How can you deal with this, especially if you're not a PHP wizard? Well, fortunately, the answer is pretty simple. A number of pre-built decoders exist, and my favorite of the day is from Steveandthesoftware. (the site is currently not responding, so you may have to check out the Google cache). There's another simpler script here, and you could always build your own if you're so inclined. Some tools also include base64 decoding - openssl does, as does the ever so useful Paros proxy tool. Even the L33t Key plugin for Firefox has base64 encode/decode support.
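If you'd rather not trust a third-party decoder site, the unwrapping is a few lines of Python - PHP's gzinflate() is raw DEFLATE, which zlib handles with a negative window size. The sample payload below is built in place for the demo, not a real exploit:

```python
# Decode PHP's eval(gzinflate(base64_decode('...'))) without executing it.
# gzinflate() is raw DEFLATE, which zlib decompresses with wbits = -15.
import base64
import zlib

def decode_layer(payload: str) -> str:
    """Undo one base64 + gzinflate layer of an obfuscated PHP payload."""
    return zlib.decompress(base64.b64decode(payload), -15).decode("latin-1")

# Build a stand-in payload the way a packer would (not a real exploit).
hidden = "<?php /* remote admin console would live here */ ?>"
comp = zlib.compressobj(9, zlib.DEFLATED, -15)  # -15 = raw deflate stream
packed = base64.b64encode(
    comp.compress(hidden.encode("latin-1")) + comp.flush()
).decode("ascii")

print(decode_layer(packed))
```

Droppers often nest several layers of this, so feed the output back through the decoder until no eval() remains.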

Today's lessons?

  1. Use Tripwire or another file integrity monitor on your web scripts and other content that doesn't change often.
  2. Have a known good, clean backup of those scripts.
  3. Make sure your web server has permissions set appropriately on all of the directories - if your web server user doesn't need write access to the directory, don't give it!
  4. Limit the permissions that your web server user (such as www or apache) has. Fortunately, this is the default for most modern servers. If your web server usage model permits it, an Apache chrooted jail may be a good idea.
  5. In the event of compromise, rebuild if you can, and if you can't, make sure you're using those known good backups.
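Lessons 1 and 2 don't require commercial tooling: a cron job that hashes your document root against an offline baseline catches this class of tampering. A minimal sketch of the idea (Tripwire and AIDE do this far more robustly):

```python
# Minimal file-integrity check for rarely-changing web content. Tripwire or
# AIDE are the real tools; this just illustrates baseline-and-compare.
import hashlib
import os

def snapshot(root: str) -> dict:
    """Map every file under root to its SHA-256 digest."""
    digests = {}
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            path = os.path.join(dirpath, name)
            with open(path, "rb") as f:
                digests[path] = hashlib.sha256(f.read()).hexdigest()
    return digests

def compare(baseline: dict, current: dict) -> list:
    """Return paths that changed, appeared, or vanished since the baseline."""
    changed = [p for p in baseline if p in current and baseline[p] != current[p]]
    added = [p for p in current if p not in baseline]
    removed = [p for p in baseline if p not in current]
    return changed + added + removed
```

Store the baseline somewhere the web server user can't write (lesson 3 again), and alert on any non-empty compare() result.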

Wednesday, April 18, 2007

Would you shop at a store that lost your data?

Would you shop at a store that lost your data? Would you attend a university that exposed your SSN? Would you donate to that university? If you were a veteran, what would you do if the VA lost your data and didn't know where it went? Do you feel better knowing that your data is out there, and does it worry you to think that many organizations don't announce breaches of your private data?

There seems to be a split between consumers' responses when polled and their actual behavior. In the articles I've linked about consumers in the UK, the majority of those polled claimed that they would take their business elsewhere if their data was exposed. The counter article notes that TJ Maxx has gained sales in the past year - in fact, they saw 6% gains in March alone. It may be that notification requirements won't kill our organizations, even if they are embarrassing.

Does this point to consumers accepting data security breaches as commonplace? Stories of people receiving multiple notifications from different organizations in a day or two are floating around, and consumers rail against irresponsible companies. Even if consumers are fed up, it seems that we, as consumers, may be reaching the point that we become used to our data being exposed. The cost of doing business in our current information society is that our data is at risk, and we frequently cannot control how much data companies gather about us. Even if we limit what we give one company, our data can be correlated to data in other databases.

What is the moral of the story for organizations? Is it "Consumers will come back" or is it "Breaches can kill you"? Only time will tell. It may be that in the near term, the huge numbers of organizations leaking data will cause consumers to become inured to the data losses. The legislative backlash that we are seeing nationwide will surely have some effect, both due to required reporting and due to more stringent requirements.

Where does that leave the security professional? With questions, of course:

  1. Are you required by state or federal law to notify, and if so, how, under what circumstances, and how quickly?
  2. What is your organization's breach notification policy?
  3. Do you have procedures in place to handle breach notification?
  4. Do you have an internal communications plan?
  5. Have you looked at insurance? Data breach and information security insurance is just becoming available, and it may be worthwhile for your organization. Dennis Trinkle mentioned insurance in his presentation at the Indiana Higher Education Security Summit - the insurance is out there if you're looking for it.
For now, security folks need to keep track of the laws - both those on the books and the bills entering state and federal legislatures. If you have vendor requirements like PCI, you need to make sure that you meet or exceed them. You also need to make sure that your own policies and procedures keep up with both law and other requirements.

Tuesday, April 10, 2007

Hard drive encryption and breach notification

I've had a number of conversations about SSN disclosure laws recently, and they're a major topic in the higher education security space. If you're an Indiana resident, last year's SSN release law and the breach disclosure laws for both private and public institutions made life more interesting. The usual disclaimer - I am not a lawyer, and you should talk to yours - definitely applies here. With that said, the Indiana laws and others include a possible out for organizations - for example, the state agency version reads:

"Sec. 5. (a) Any state agency that owns or licenses computerized data that includes personal information shall disclose a breach of the security of the system following discovery or notification of the breach to any state resident whose unencrypted personal information was or is reasonably believed to have been acquired by an unauthorized person."
The key here is the "unencrypted" - if you can reasonably state that the data was encrypted, and could not be accessed, then your reporting requirements typically are lessened, if not largely removed. It is worth noting that in general, the laws do not specify an encryption technology - presumably ROT13 would be considered a less than good faith effort.
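The ROT13 jab is easy to demonstrate - it's a fixed letter rotation that is its own inverse, so anyone can undo it instantly:

```python
# ROT13 is its own inverse: applying it twice returns the original text,
# which is why it would never pass as a good-faith encryption effort.
import codecs

record = "Name: Jane Doe"
scrambled = codecs.encode(record, "rot_13")
print(scrambled)  # -> Anzr: Wnar Qbr

assert codecs.encode(scrambled, "rot_13") == record
```

Note that digits pass through ROT13 untouched, so an SSN "protected" this way isn't even scrambled.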

Does it now behoove you to encrypt everything? Possibly, if you deal with covered data everywhere, but few organizations do. At the least, highly sensitive systems and systems that are likely to be exposed should be considered in your remediation plan. Most organizations are considering laptops and PDA/smartphone devices as part of their first round of targets. Both are likely to be lost or stolen, and both often contain local copies of sensitive data. The past 12 months have shown a number of cases of stolen or missing laptops.

From a technical perspective, you have a few options:
  1. Full drive encryption software. Utimaco's product is a good example of this. The best part about software of this nature is that it can be installed on existing systems. Pay particular attention to backups, key escrow, and OS and hardware compatibility. You're sure to find some systems that your encryption scheme just won't work on - many products are Windows-centric. There are some real benefits to full drive encryption, including temporary space and virtual memory being encrypted. There is also typically a real performance hit, particularly for disk intensive activities such as using a virtual machine.
  2. User directory encryption - tools like Windows' EFS and OS X's FileVault can easily encrypt user directories (BitLocker, despite frequent comparisons, encrypts full volumes and fits the first category). The caveat here is that you have to make sure that data isn't stored elsewhere - a single application that stores data elsewhere, or a user who saves to the unencrypted root of the hard drive, can make your encryption useless. In the case of FileVault, you'll also need to figure out a method for managing the master password. There is still a performance penalty, but software outside the user directory should run at normal speed - the hit is constrained to files in the user's encrypted directories.
  3. File or volume encryption. Security professionals will probably point you to software like TrueCrypt for this. You encrypt a file or volume and unlock it only when you need it. This approach does not account for memory-resident data, nor does it handle temporary files, making it far more difficult to claim that data could not have been exposed.
Should your organization use an encryption product? If you deal with sensitive data and want to safeguard it, you should definitely take a look at the products on the market. If your organization is subject to a disclosure law, encryption products may also provide needed protection, acting as both a reputation safeguard and a data disclosure control.

What about cost? In most cases, encryption products can be had in a variety of price ranges - Bitlocker and FileVault are free with Vista and OS X, while commercial solutions range in price up to a few hundred dollars per machine. As with any security control, you'll need to gauge the cost and benefit to determine where and when you should deploy the technology.

Wednesday, April 4, 2007

Social engineering in the workplace: avoiding the "evil security guy" tag.

Security is evil. We say no (that's "default deny" to you), we enforce policies, and we make life difficult.

We ask hard questions, we poke holes in the beautiful software that you just wrote, and worst of all, we take up your time that could be spent on more important things than doing security assessments and configuration checks.

Really, we just get in the way.

Or, at least, that's how it feels a lot of the time. Then, a security event happens, and suddenly the security guy has a new shine!

Sadly, that's not the best way to work with IT staff. Security needs to work effectively with IT staff and the organizational community all of the time, not just during emergencies. What can we, as security professionals, do to build ties with the administrators, staff, and other employees? I have a few favorite tactics to make security staff more available to the rest of the organization.

Lunch with the security guys

Every few weeks, I eat lunch with a group of systems administrators from across the organization. It isn't a formal meeting, just a meeting of fellow geeks. Often, more useful information is exchanged at these lunches than in a week of meetings - and more happens as a result of them.

The real security benefit, however, is that I'm available as a resource in a non-formalized environment. Questions that never come up elsewhere are brought up, discussed - and often I don't have to provide an answer. The others in the group will already know it, or they've heard it from me before.

The key here is being approachable - the same ideas work for CISOs and other security management professionals - up to a point. Define a line of professionalism, but make yourself available.


In larger organizations, security staff may only be heard of when they're bringing trouble to your door. It helps to build ties before the event. I've made a habit of getting out periodically and chatting with various contacts across campus. They tend to refer things to me, and it helps me keep my finger on the pulse of the IT community.

There is a balance here - at some point, being heads down working on security projects is more useful, but making these contacts can be an effective part of a security outreach program.


We all like to read about social engineering - why not do some of your own? If you walk into my office, you'll find a bright yellow 1960s Civil Defense Geiger counter, magnetic building toys, and a selection of candy all out and easy to get to. Why?

They lure in IT staff.

I've had more conversations because of someone wandering down the hall and spotting the Geiger counter through my open door than I can count. The candy means that people stop by, ask a question, grab a handful, and meander back out. The building toys give engineering types something to play with as they explain their problems.

Simply taking the time to chat with the folks you work with is a valuable security tool. Yes, the "evil security guy" image is useful at times, but having an IT community that willingly comes to you before the problem becomes an issue is worth the trade.

Tuesday, April 3, 2007

Easy signed and encrypted email

I'm frequently asked how to easily send email more securely. My default answer in the past was to point the person to PGP or GPG, with the caveat that neither was particularly user friendly and that it would require additional software.

That's not the case these days. Most major email clients include S/MIME support, and getting a certificate is pretty easy - for example, Thawte provides free personal email certificates which are easily imported into your client. From there, you can use the certificate to sign and/or encrypt your email. You'll need the person on the other end to send you their certificate before you can send them encrypted mail, but again, modern email clients make this a snap. For most clients, simply have the remote user send you a signed email and click the "import" button when you see the "signed mail" icon show up.

Pretty easy, right?

What's the difference between signing an email and encrypting it? Signing an email is a way of ensuring that the email is a) one you wrote and b) hasn't been changed along the way. A signed email says "This is from me, and it is exactly the message I wrote to you". An encrypted message is just that - encrypted so that it cannot be read except by people who have the right key to unlock it. Put them both together - a signed and encrypted message, and you have an email that you know came from a given person, you know it hasn't changed in content, and you know nobody else saw it along the way.

The truly paranoid security types would have me note that "you know that it came from a given person" is really "You decided to trust the certificate authority" and that "nobody else saw it" means "you can be reasonably certain that without heroic means using modern technology, nobody else can read the message". Digital signatures are highly dependent on trust models - the signature is only as good as the trust you can place in it.

If you know you want to do S/MIME, an easy way to get an S/MIME certificate from a trusted third party is to use Thawte's free personal email certificate. If you do use Thawte's free email certificate program, you can find notaries in most major metro areas, or you can catch them at your next security conference. A notary will verify that you are who you say you are for Thawte - basically a distributed trust model. If you're far away from any notary, there are alternate ways to get certified, albeit at a cost. If you get your certificate notarized for 50 points or more, you can put your name in your certificate. If you get to 100 points, you can become a notary. After that, the more people you notarize, the more points you will be able to give out - notaries max out at 35 points. In a reasonably sized organization, a handful of notaries can quickly become 35 point notaries and bootstrap new notaries easily.

I should tell you up front that Thawte will want what they call a "National ID number". I counsel people to use their driver's license number rather than their SSN here, as any notary who signs your certificate will need to keep that information on file.

Should your organization use signed or encrypted email? That's a matter of needs, policy, and practical considerations for implementation - and we'll talk about that another day. For now, if your question is "can I, as an individual, easily send signed or encrypted email", the answer is yes!

Monday, April 2, 2007

Security deals: eEye's Blink Personal for free

eEye is offering their Blink Personal Internet security product, including antivirus, with a free one-year license for a limited time.

According to an article on Dark Reading, the product will likely remain free after the one year period, as eEye is using consumer data to fuel their pro and commercial products.

"[Ross Brown, CEO of eEye] says the free consumer tool will also help eEye gather the data to build a better commercial product for its traditional business -- commercial and security-savvy users. The company is offering a free one-year subscription, but it doesn't intend to start charging for Blink Personal after the one year is up. 'The renewal will probably be free again, too.'"
The software is worth a look for Windows users as an alternative to traditional standalone AV and firewall products - it includes a number of other capabilities such as host-based IPS, system and application firewalls, and patch management in addition to AV.

Remember, free is an easy sell to friends and family who otherwise might go without quality security products. Software such as Zone Alarm and AVG's free personal use antivirus are on my frequently recommended list for people who can't or won't buy products.