The Hostway Blog

Is Anything Private Online?

Protecting your information on the Internet seems to be getting more difficult each day. You feel like you’ve been very careful about how and when you post something like your email address and yet, your inbox is full to the brim with spam. Can you really publish information on the Web and have it remain private?

How to Protect Your Email Address

The most common piece of information we share on the Web is our email address. Even if you’re careful, it can fall into the wrong hands. And, as a business, your email address has to be accessible to the public. There are, however, some things you can do to try to slow the spammers and phishers down.

Most email harvesting is done by automated software programs. You can trip them up a little by employing a few simple techniques.

  • At or dot: Forgo the @ symbol and the period. You can post your email address as JohnSmith at example.com or JohnSmith at example dot com. The downside is that this isn’t the most professional way to present yourself or your company, and many email harvesting programs are smart enough to adjust.
  • Post it as an image: Another common ploy is using an image instead of text. You can display a JPEG of your email address in lieu of plain text. But some spammers have programs that can read the text within images.
  • Email encoder: Email address encoders transform your email address into a code that helps thwart harvesters. You can also use JavaScript to encode your raw email address.
  • CAPTCHA: One of the most effective ways to protect your email address is a CAPTCHA (Completely Automated Public Turing test to tell Computers and Humans Apart). These challenge tests display images of squiggly letters and numbers that humans can decipher but computers find difficult to read. reCAPTCHA’s MailHide is a free service that will generate a code to protect your email address.
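To make the encoding ideas above concrete, here is a minimal JavaScript sketch of two common approaches: converting each character of the address into an HTML numeric entity, and assembling the address at runtime so it never appears as plain text in the page source. The function names (`encodeEntities`, `makeMailtoLink`) are illustrative, not from any library.

```javascript
// 1. Entity encoding: turn each character into an HTML numeric entity.
//    A browser renders it as the normal address, but the raw page
//    source no longer contains the plain-text string.
function encodeEntities(address) {
  return address
    .split("")
    .map((ch) => "&#" + ch.charCodeAt(0) + ";")
    .join("");
}

// 2. Runtime assembly: keep the user and domain apart in your markup
//    and join them only when the page's script runs, which defeats
//    harvesters that only scan the static HTML.
function makeMailtoLink(user, domain) {
  const address = user + "@" + domain;
  return '<a href="mailto:' + address + '">' + address + "</a>";
}
```

Neither trick stops a harvester that executes JavaScript or decodes entities, but each raises the bar above plain text.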

No Indexing, Please

You can take steps to keep pages of your Web site from being indexed, and therefore from becoming more accessible to harvesters, but most of these methods only keep the good bots out.

Following the Robots Exclusion Protocol with a /robots.txt file will tell bots what not to crawl, but it won’t stop the naughty ones. The same is true of the noindex meta tag; it really only keeps the good guys out.
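As an illustration, a /robots.txt file placed at the root of your site that asks all crawlers to skip a contact directory might look like this (the /contact/ path is a hypothetical example):

```
User-agent: *
Disallow: /contact/
```

The equivalent per-page request is a meta tag in the page’s head, such as `<meta name="robots" content="noindex">`. Again, well-behaved crawlers honor both; harvesters generally ignore them.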

The best way to protect your Web pages from prying eyes is to use some form of password protection. This gives you complete control over who sees what. There are server-side and client-side solutions. If you’re not sure where to start, talk with your Web host.
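One common server-side sketch, assuming your site runs on Apache with overrides enabled, is basic authentication via an .htaccess file in the directory you want to protect (the .htpasswd path below is a placeholder; your host’s layout will differ):

```
AuthType Basic
AuthName "Restricted Area"
AuthUserFile /path/to/.htpasswd
Require valid-user
```

You would create the password file with Apache’s `htpasswd` utility, e.g. `htpasswd -c /path/to/.htpasswd johnsmith`. Unlike the obfuscation tricks above, this actually blocks access: a bot with no credentials gets a 401 response instead of your page.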