What do you do about staff and personal laptops?

Today, one of our developers had his laptop stolen from his house. Apparently, he had a full svn checkout of the company's source code, as well as a full copy of the SQL database.

This is one massive reason why I'm personally against allowing company work on personal laptops.
However, even if this had been a company-owned laptop, we'd still have the same problem, although we would be in a slightly stronger position to enforce whole-disk encryption (WDE).

My questions are these:

  1. What does your company do about company data on non-company-owned hardware?
  2. Is WDE a sensible solution? Does it produce a lot of overhead on reads/writes?
  3. Other than changing passwords for things that were stored/accessed from there, is there anything else you can suggest?

Solution 1:

  1. The problem is that allowing people to do unpaid overtime on their own kit is very cheap, so managers aren't so willing to stop it; but they will, of course, be happy to blame IT when there's a leak... Only a strongly enforced policy is going to prevent this. It's down to management where they want to strike the balance, but it's very much a people problem.

  2. I've tested WDE (TrueCrypt) on laptops with admin-level workloads, and performance-wise it's really not that bad; the I/O hit is negligible. I have several developers keeping ~20GB working copies on encrypted disks, too. It's not a 'solution' in itself (it won't stop the data being slurped off an unsecured machine while it's booted, for instance), but it certainly closes a lot of doors. If you want figures for your own hardware, see the throughput sketch after this list.

  3. How about a blanket ban on all externally held data, followed by some investment in remote desktop services, a decent VPN and the bandwidth to support it? That way all code stays inside the office, the users get a session with local network access to resources, and home machines become dumb terminals. It won't suit all environments (intermittent access or high latency might be a deal-breaker in your case; the latency probe below gives a quick way to check), but it's worth considering if home working is important to the company.
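
If you want to put a number on the WDE overhead for your own hardware rather than take point 2 on faith, a crude sequential-throughput test is usually enough to see whether the hit matters for your workload. A minimal Python sketch (not a rigorous benchmark; the test path is a placeholder): run it once against a file on an encrypted volume and once on a plain one, and compare.

```python
import os
import time

def sequential_throughput(path, size_mb=256, block=1 << 20):
    """Time a sequential write and read of size_mb MiB at `path`; return MB/s."""
    buf = os.urandom(block)

    t0 = time.perf_counter()
    with open(path, "wb") as f:
        for _ in range(size_mb):
            f.write(buf)
        f.flush()
        os.fsync(f.fileno())  # make sure the data actually reaches the disk
    write_s = time.perf_counter() - t0

    t0 = time.perf_counter()
    with open(path, "rb") as f:
        while f.read(block):
            pass
    read_s = time.perf_counter() - t0

    os.remove(path)
    # Caveat: the read pass may be served from the page cache, so treat the
    # read figure as optimistic unless you drop caches between runs.
    return size_mb / write_s, size_mb / read_s

if __name__ == "__main__":
    w, r = sequential_throughput("/tmp/wde_test.bin")  # point at the volume under test
    print(f"write: {w:.1f} MB/s, read: {r:.1f} MB/s")
```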
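
On the latency caveat in point 3: before investing in terminal servers, it's worth measuring what home users would actually see. TCP connect time to the service port is a rough proxy for session round-trip time; as a rule of thumb, medians much above 100 ms start to feel sluggish in an interactive desktop session. A small sketch (the hostname is hypothetical, and port 3389 assumes RDP; substitute whatever your remote desktop service listens on):

```python
import socket
import statistics
import time

def tcp_connect_times(host, port=3389, samples=10):
    """Measure TCP connect time to host:port in ms, a crude proxy for session latency."""
    rtts = []
    for _ in range(samples):
        t0 = time.perf_counter()
        with socket.create_connection((host, port), timeout=3):
            pass  # we only want the handshake time; the connection closes immediately
        rtts.append((time.perf_counter() - t0) * 1000)
        time.sleep(0.2)
    return statistics.median(rtts), max(rtts)

if __name__ == "__main__":
    median_ms, worst_ms = tcp_connect_times("ts.example.com")  # hypothetical terminal server
    print(f"median {median_ms:.0f} ms, worst {worst_ms:.0f} ms")
```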

Solution 2:

Our company requires whole-disk encryption on all company-owned laptops. Sure, there's an overhead, but for most of our users this isn't an issue -- they're running web browsers and office suites. My MacBook is encrypted, and I haven't really noticed the impact, even when running VMs under VirtualBox. For someone who spends much of their day compiling large trees of code it might be more of an issue.

You obviously need a policy framework for this sort of thing: you need to require that all company-owned laptops are encrypted, and you need to require that company data cannot be stored on non-company-owned equipment. Your policy needs to be enforced for technical and executive staff too, even if they complain; otherwise you're just going to run into the same problem again.

Solution 3:

I would focus less on the equipment itself and more on the data involved. This will help avoid the problems you're running into now. You may not have the leverage to mandate policy on personally owned equipment, but you had better have the leverage to mandate how company-owned data is handled. As a university, we have issues like this come up all the time. Faculty may not be funded in such a way that their department can buy them a computer, or they may buy a data-processing server on a grant. In general, the solution to these problems is to protect the data, not the hardware.

Does your organization have a Data Classification policy? If so, what does it say? How would the code repository be classified? What requirements would be placed on that category? If the answer to any of those is either "No" or "I don't know", then I would recommend talking to your Information Security office, or whoever in your organization is responsible for developing policies.

Based on what you say was released, were I the data owner I would likely classify it as High, or Code Red, or whatever your highest level is. Typically that would require encryption at rest and in transit, and it may even impose restrictions on where the data is allowed to be housed.
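
To make "encryption at rest" concrete at the application level: if a sensitive extract genuinely has to leave the server, it should be encrypted before it touches any disk you don't control. A minimal sketch using the third-party cryptography package's Fernet recipe (the file names are placeholders); note this complements whole-disk encryption rather than replacing it, since it only protects the individual artifact.

```python
from cryptography.fernet import Fernet

# Generate once and keep in a proper secret store -- never alongside the data.
key = Fernet.generate_key()
fernet = Fernet(key)

# Encrypt a dump before it is ever copied to a laptop or backup medium.
with open("customers.sql", "rb") as src:        # placeholder file name
    token = fernet.encrypt(src.read())
with open("customers.sql.enc", "wb") as dst:
    dst.write(token)

# Without the key, a stolen copy is just ciphertext; with it, decryption is:
plaintext = fernet.decrypt(token)
```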

Beyond that, you may be looking at implementing some secure programming practices: something that codifies a development life cycle and expressly disallows developers from coming into contact with a production database except in rare and unusual circumstances.
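
One practical piece of such a life cycle: give developers the schema plus masked data instead of a copy of production, so a stolen dev laptop holds nothing worth having. A minimal sketch of the masking step using the standard library's sqlite3 (the table and column names are invented for illustration); a real pipeline would drive the column list from your data classification rather than hard-coding it.

```python
import hashlib
import sqlite3

SENSITIVE = {"email", "name", "phone"}  # invented column names, for illustration

def mask(value):
    """Deterministic pseudonym: same input -> same output, but not reversible."""
    return hashlib.sha256(str(value).encode()).hexdigest()[:12]

def sanitized_rows(conn, table):
    """Yield rows from `table` with sensitive columns replaced by pseudonyms."""
    # Table name comes from trusted config here, never from user input.
    cur = conn.execute(f"SELECT * FROM {table}")
    cols = [d[0] for d in cur.description]
    for row in cur:
        yield tuple(mask(v) if c in SENSITIVE else v for c, v in zip(cols, row))

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE users (id INTEGER, name TEXT, email TEXT)")
    conn.execute("INSERT INTO users VALUES (1, 'Alice', 'alice@example.com')")
    for row in sanitized_rows(conn, "users"):
        print(row)  # id passes through; name and email come out pseudonymised
```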

Solution 4:

1.) Working remotely

For developers, remote desktop is a very good solution unless 3D acceleration is required. The performance is usually good enough.

In my eyes, remote desktop is even safer than a VPN, because an unlocked notebook with an active VPN connection gives access to quite a bit more than a view of a terminal server session would.

VPN access should only be given to people who can prove they need more.

Moving sensitive data out of house is a no-go and should be prevented where possible. Working as a developer without network access is impractical anyway: without source control, issue tracking, documentation systems and communications, efficiency is iffy at best.

2.) Use of non-company hardware on the network

A company should have a standard for what is required of hardware attached to the LAN:

  • Antivirus
  • Firewall
  • be in the domain, be inventoried
  • if mobile, be encrypted
  • users do not have local admin (difficult with developers, but doable)
  • etc.

Foreign hardware should either follow these guidelines or stay off the network. You could set up NAC (network access control) to enforce that.
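
NAC products differ, but the admission decision boils down to evaluating an endpoint's posture report against the checklist above. A sketch of that logic in Python, with invented field names, just to show the shape of the rule set:

```python
from dataclasses import dataclass

@dataclass
class Posture:
    """What an endpoint agent might report; field names are invented for illustration."""
    antivirus_running: bool
    firewall_enabled: bool
    domain_joined: bool
    inventoried: bool
    is_mobile: bool
    disk_encrypted: bool
    user_is_local_admin: bool

def violations(p):
    """Return the list of policy violations; an empty list means admit to the LAN."""
    problems = []
    if not p.antivirus_running:
        problems.append("no antivirus")
    if not p.firewall_enabled:
        problems.append("firewall disabled")
    if not (p.domain_joined and p.inventoried):
        problems.append("not in the domain / not inventoried")
    if p.is_mobile and not p.disk_encrypted:
        problems.append("mobile device without disk encryption")
    if p.user_is_local_admin:
        problems.append("user has local admin")
    return problems

if __name__ == "__main__":
    laptop = Posture(True, True, True, True,
                     is_mobile=True, disk_encrypted=False, user_is_local_admin=False)
    print(violations(laptop) or "admit")  # -> ['mobile device without disk encryption']
```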

3.) Little can be done about the spilled milk, but steps can be taken to avoid a recurrence.

If the above steps are taken, and notebooks are little more than mobile thin clients, not much more is necessary. Hey, you can even buy cheap notebooks (or use old ones).