August 25, 2016
By Dan Rubins
Reading time: 4 minutes

In spite of developers’ best efforts, technology always has security flaws. Legal firms tend not to talk about their data breaches, but the confidential information they hold makes them attractive targets. Intruders who gain information can use it for business espionage or blackmail, which makes security especially concerning for law firms and legal tech vendors.

Earlier this year, some large US law firms suffered major data breaches. The FBI and the Manhattan US Attorney are now investigating them. Little information is available so far on their extent, but these events call attention to the need for strong security and some deep soul searching among technology vendors and law firms alike.

Shortly after the US law firm breaches were revealed, the biggest leak in history was reported by the International Consortium of Investigative Journalists, revealing global corruption on a truly staggering scale, all mediated by a law firm based in Panama.

The sheer quantity of leaked data dwarfs the WikiLeaks Cablegate leak of 2010 (1.7 GB), Offshore Leaks in 2013 (260 GB), the 2014 Lux Leaks (4 GB), and the 2015 Swiss Leaks (3.3 GB). The Panama Papers, at 2.6 TB, amount to 2,600 GB. (“About the Panama Papers”, Süddeutsche Zeitung.)

The Legal Cloud Computing Association (LCCA) has published standards for data security in the legal world in collaboration with a few leading vendors, law societies, and bar associations. We’re big fans of the work the LCCA has done, but it’s not enough. These standards serve as an excellent starting point, but few would say they comprise a comprehensive security policy.

On ethical hacking, the LCCA’s standards unfortunately stop at disclosing how often a vendor engages ethical hackers. Continuous engagement with ethical hackers, including compensating them for their work, is an increasingly important security practice. Major corporations such as Facebook, Google, and (finally) Apple have instituted “bug bounty” programs, rewarding people for reporting security issues before malicious parties can exploit them. Google has paid out over $4,000,000 in bounties since 2010.

Bounty hunters, also called “white hats” or “ethical hackers,” probe servers for weaknesses and file reports. If they’re the first, and the company accepts their report, they get a nice reward. To qualify, they have to describe the bug in clear terms, demonstrate how to replicate it, and explain what consequences follow.

Like Hippocrates, they’re expected to follow the rule, “First do no harm.” They need to show that they could do damage without actually doing any. For instance, if they’re showing that they can make a website run a script of their choosing, they should run one that just says “Hi, I’m a script running on your website.” They should report the bug only to the site owner, at least until it’s obvious that the owner is ignoring the problem and they have to take it to a higher level.

In our experience, most of these people are security professionals looking for fun or a little extra money. Some make a full-time living from bug hunting.

Certain types of bugs are common on websites, and bounty hunters will try these, either manually or with the help of software tools.

  • Invalid input may make a website or application misbehave. Entering an impossible date or typing letters where a form expects numbers may lead to an error state that exposes data to the user.
  • Unsanitized inputs can let a user filling in a form give instructions to the database, such as deleting or retrieving data. Safe form processing code “sanitizes” the entries by removing or altering any text that an SQL processor could interpret as instructions rather than data.
  • Cross-site scripting makes the site execute arbitrary JavaScript. If a user can submit content such as comments, the site’s code needs to neutralize any executable JavaScript, typically by escaping or stripping it, before displaying the content.
  • Dumpster diving, in the digital sense, means combing through a browser’s history on a public or shared computer for possible exploits. Sometimes it lets a user revive a previously logged-out session, or reveals in excessive detail what the previous user did.
  • URL guessing may let a user discover files that are on the website but not intended for public viewing. In badly configured sites, these may contain passwords or other sensitive information.
Traditionally, bug bounties have been the province of large companies because of the difficulty of running the program. It’s necessary to go through a lot of badly written reports to find a real bug, and too low a reward may merely invite people to experiment on the site without attracting the kind of people who’ll find real bugs and report them properly. Lately, though, service companies such as HackerOne have offered cloud-based bug bounty systems that are affordable and provide access to a talented pool of ethical hackers.
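To make the unsanitized-input class above concrete, here is a minimal sketch in Python using the standard library’s sqlite3 module. The table, column names, and the injection string are illustrative assumptions, not anything from a real system; the point is that parameterized queries keep user input as data rather than letting it act as SQL.

```python
import sqlite3

# Toy database for illustration only.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE clients (name TEXT, matter TEXT)")
conn.execute("INSERT INTO clients VALUES ('Acme Corp', 'M-001')")

# A typical injection attempt submitted through a form field.
user_input = "Acme Corp' OR '1'='1"

# Unsafe: building the query by string interpolation would let the
# input rewrite the WHERE clause and return every row:
#   f"SELECT matter FROM clients WHERE name = '{user_input}'"

# Safe: the ? placeholder binds the input as a literal value,
# so the injection attempt matches no client name.
rows = conn.execute(
    "SELECT matter FROM clients WHERE name = ?", (user_input,)
).fetchall()
print(rows)  # [] -- the attack string is treated as data, not SQL
```

Most database drivers and ORMs offer the same placeholder mechanism; the bug arises only when code bypasses it and concatenates input into query strings.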
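The cross-site scripting item can be sketched just as briefly. One common defense is escaping user-submitted content before rendering it; this example uses Python’s built-in html module, and the comment text is the same harmless proof-of-concept script mentioned below.

```python
import html

# User-submitted comment containing a proof-of-concept script.
comment = '<script>alert("Hi, I\'m a script running on your website.")</script>'

# Escaping converts markup characters to HTML entities, so the
# browser displays the comment as text instead of executing it.
safe = html.escape(comment)
print(safe)  # the angle brackets and quotes are now entities like &lt; and &quot;
```

Escaping at output time is one approach; templating engines typically do this automatically, and stripping script tags on input is another, though escaping is generally considered the more robust default.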

The benefits of a well-run bounty program can be considerable. It’s cheaper to pay a reasonable reward for the early detection of security holes than to fix the problem and deal with possible liability after someone uses them maliciously. It’s also the right thing to do for customers. For this reason, we encourage the LCCA to include bug bounty programs as a standard in the next revision.

Since the very first version of our product was available for our own internal testing, we at Legal Robot have run an invite-only ethical hacking and bug bounty program through HackerOne. Today, we are launching this program publicly. Any ethical hacker, anywhere in the world, who follows our program rules can receive compensation for finding and responsibly reporting security issues. We are also making a public commitment to meet or exceed the LCCA’s Security Standards and hope to join the organization as a member in due course.

We also encourage fellow Legal Tech companies to implement bug bounty programs. If you’re an executive at a Legal Tech company and have questions, please reach out to us at [email protected]. This issue is too important for our industry to ignore any longer.


