Posted by: mzuzarte | July 6, 2010

Wireless Configuration Fail

Wireless Network Fail

If this is your network, I think we need to talk…

Posted by: mzuzarte | July 6, 2010

Facebook apps starting to get really annoying.

Before, Facebook apps and invites were merely somewhat annoying: people asking you to “like” various causes, attend pointless events and join groups.  Recently, however, I’ve started to notice more maliciously driven invites, fuelled by users assuming that whatever is on Facebook is secure, even though this isn’t the case.

For example, one Facebook friend of mine has been continuously trying to get me to join a group called “My girlfriend got me a free iPad, get yours too!”.  I decline the invite, but a few hours later I notice that many of our mutual friends are inviting me to the same group.  What happened to all of the “internet street smarts” people developed over these past years?  Users became immune to pop-up ads promising a free iPod for clicking on a monkey, yet they click away freely on the Facebook site.  THERE IS NO FREE IPAD!!!!!!!!

Anyhow, beyond my disgust at my fellow Facebook users for letting these viral invite/app mechanisms propagate, I have grown even more disgusted at Facebook, the social networking giant, for letting this happen on its systems.  Providers of such networks, Facebook, Twitter, MySpace and the rest, need to stay on top of their security.  If they can maintain a safe user environment, I’m sure their enrollment will grow beyond children and teenagers looking to move up the ranks of social status, and security people like me will stop bashing them.

Well, it’s been quite a while since my last blog post.  I found myself quite swamped over the past few months due to my upcoming nuptials, the closing of my new home, vehicle issues and, oh, did I mention WORK?  It’s been over a month since my last blog post about human vs. technical vulnerabilities; apparently that’s how long it takes me to get riled up and want to rant again.  My post today is about a similar topic: the difference between having security infrastructure in place and making sure it is configured to do what it was designed to do.

So you have a network you need to secure.  Good for you.  The words “secure” and “network” caused a voice in your head to tell you to buy a firewall, even better, a firewall with IPS functionality.  Great.  You toss the devices into your network, spend some late nights figuring out how to configure PPPoE, a DHCP pool and maybe a VPN tunnel or two, and after a dozen or so runs to Tim Hortons all your inside hosts can access the internet and be productive.  So you sit back, relax and have a beer… There is your mistake.

I had a brief conversation yesterday with a fellow security professional.  He was investigating a mid-sized breach at a financial institution that had invested almost a million dollars in network security but had failed to maintain a secure configuration.  That failure led to a breach in which financial accounts were compromised.

One can’t expect to just throw money at a security need and have it go away.  Someone with knowledge of the firewall architecture and network setup needs to be involved in configuring, and maintaining the configuration of, the devices.  Questions such as “How can this network be segmented?”, “What is the least privilege level necessary for specific users?”, “What holes need to be poked to the outside?” and “What VPN parameters should be used?” need to be asked and answered.
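
To make the least-privilege question concrete, here’s a minimal sketch in Python.  The port lists and the policy are invented for illustration; in practice you’d parse them out of your firewall’s own config export.

```python
# Hypothetical least-privilege audit: compare the inbound ports a firewall
# actually allows against the ports the business has justified.

# "What holes need to be poked to the outside?" -- the justified list
required_inbound = {443, 25}              # HTTPS, SMTP (assumed example)

# Ports currently open (e.g. parsed from a firewall config dump)
configured_inbound = {443, 25, 23, 3389}  # Telnet and RDP snuck in!

def audit(required, configured):
    """Return (unneeded, missing) port sets relative to least privilege."""
    unneeded = configured - required   # open but never justified: close these
    missing = required - configured    # justified but blocked: business impact
    return unneeded, missing

unneeded, missing = audit(required_inbound, configured_inbound)
print("Close these ports:", sorted(unneeded))   # [23, 3389]
print("Missing required ports:", sorted(missing))
```

The point isn’t the ten lines of code; it’s that someone has to write down the “required” list in the first place, and that list only exists if the questions above get asked.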

Don’t take on the burden of administering the security of your network yourself.  As much as this may sound like a shameless plug for my company (firewall management is a big chunk of what we do), it’s hard to overstate how important and valuable a role an MSP can play in the security of your network.  We use, configure and research firewalls and security devices on a day-to-day basis.  We know the tricks and quirks of different firewall models and vendors.  Do yourself and your company a favour: get someone in to help you with your network before you are solely responsible for a major security breach.  “I didn’t know” isn’t an acceptable excuse.

Posted by: mzuzarte | June 3, 2010

Vulnerabilities: Human vs Technical

What a week it has been!  Scorching heat, pouring rain, vehicle trouble, no AC and having to use a loaner laptop.  Not much fun at all.  Regardless, it’s a regular-length week, so it’s back to work as usual.

While I was doing some contract work for a customer, he began asking me some questions about network security: what it is I do, and what other network administrators opt for in their network architecture.  We spoke about it briefly, but afterwards he said, “Well, I’m glad I don’t have to worry about security issues; my network is as secure as it can be.”  Now I know what you are thinking, “no network is 100% secure”, but I’ll agree, he was definitely on top of his network security.  His systems were locked down, patched regularly, and physically secured in a server room that required two-factor authentication.  He was, however, missing one critical issue: the human vulnerability.

Over the past few years, I’m sure you have noticed the increase in the number of phishing e-mails, misleading pop-ups and attempted malware infections.  This shows a change in the methods attackers use to compromise a system or steal data.  Systems are now more secure, managed by more skilled individuals (End-To-End Security, hint hint) who keep on top of patches, vulnerabilities and holes.  Attackers have changed the direction of their attacks, focusing more on attacking the user rather than the technology.  It’s far easier to gain access to a user’s machine on the inside network than to crack a firewall, get around an IDS and then crack a server while covering your tracks.  A user with no security education is more likely to click on the link that says “Your bank account has been compromised, click here to change your password”.

Symantec recently released some stats stating that only 3 percent of the malicious software their security products encounter exploits a technical vulnerability in the OS or network.  The other 97 percent is, in some way or form, trying to social-engineer the user into doing something they shouldn’t.  Refer back to my blog post about “scareware“.

This in turn introduces yet another set of products.  Content filtering (web, spam, etc.) as well as DLP solutions have come a long way since they were first introduced.  The vendor Websense stands out from the pack in particular; they have crafted some interesting and effective solutions for protecting your end users from themselves.  Their most popular product is their web filtering service, but their web security, data security and e-mail security suites are all top notch.

In conclusion, it is the responsibility of the network security admin not only to secure the network, but also to train users to be more web smart and able to identify risks on the internet.  This is a critical part of the position, and one that you must not fail at!  If you need some help with it, you can always enlist a variety of security suites to help enforce internet policies.  If you need any help navigating the product lines, feel free to drop me an e-mail and I’ll get you on the right track.

Posted by: mzuzarte | May 28, 2010

Hacktivism and Cyber-Terrorism

Ah, finally Friday.  It’s been quite a busy week; rather than having a shortened week, I feel like I had to cram five days of work into four.  Anyhow, a news story caught my eye this week that I felt like blogging about.  In South Florida, someone managed to hack a digital road-work sign to read “No Latinos, No Tacos”… or is it “No Tacos, No Latinos”?  Unfortunately, it was a bit hard to tell what statement the sign was trying to make.  The hacked sign was obviously a response to the passing of Senate Bill 1070 in Arizona, but I won’t get into that; too political for me.

Hacked Road Sign

So the word of the day is “hacktivism”: compromising a system to make a political or moral statement.  As Wikipedia does such a great job of defining it, I would be doing you a disservice not to quote them:

Hacktivism (a portmanteau of hack and activism) is “the nonviolent use of illegal or legally ambiguous digital tools in pursuit of political ends. These tools include web site defacements, redirects, denial-of-service attacks, information theft, web site parodies, virtual sit-ins, virtual sabotage, and software development.” It is often understood as the writing of code to promote political ideology – promoting expressive politics, free speech, human rights, or information ethics. Acts of hacktivism are carried out in the belief that proper use of code will be able to produce similar results to those produced by regular activism or civil disobedience.

The saying “one man’s terrorist is another man’s freedom fighter” plays out in the hacktivism world as well.  The means a hacktivist uses to spread their message may affect users’ access to resources or cause system downtime.  This causes some to view “hacktivism” as “cyber-terrorism”.

In February of 2010, the hacktivist group known as “Anonymous” launched a DoS attack against the Australian government over its attempt to filter what content came in and out of the country.  In doing so, the group caused downtime for some users and government workers, which in turn got them branded as cyber-terrorists by some groups.

Whether it’s called hacktivism or cyber-terrorism, the core concept is the same: compromising a system to further a cause, message or ideal.  Regardless of the cause, the job of an ethical security professional is to keep the network you are responsible for secure.  Governments and political organizations are prime targets for these kinds of attacks.  It’s up to the security professionals in their employ to keep their networks up and keep the triad of security as their goal: confidentiality, integrity, availability.

Posted by: mzuzarte | May 18, 2010

Lack of knowledge is not an excuse!

Network security is a very unforgiving beast.  Forgetting to close a hole, change that password or patch that vulnerability can easily cost you: leaked confidential information, or the hijacking of your network.  Unfortunately, I’ve noticed many trying to hide behind the cloak of ignorance, saying that they secured their network to the “best of their ability”.  This isn’t 10th grade, where your geography teacher gave you some extra marks because you “tried your best”.  A hacker on the other end of the wire doesn’t have pity on you.

What spawned this post was a candid conversation I had with an employee of another managed service provider who only deals with end-user equipment (desktops, printers, small servers).  He was asking me some questions about security best practices and firewalls.  One of his customers had asked him to configure a small server and a Linksys firewall for their company; he accepted, but wasn’t entirely sure what needed to be done in terms of security.  I went through a mental list of the aspects of the network he would have to review.  He had already set up the network and was blown away by the number of points he had missed.  He said, “Well, I worked with what I had”…

Statistics show that 38% of all compromised systems were misconfigured, and that misconfiguration was the main reason the system fell victim to an attack.  I personally find that number quite conservative; I would expect the percentage to be a fair bit higher.  I’m not sure if they took into account system design flaws, such as improper network segmentation, zone policies or user access.  I would consider those to be misconfigurations on the part of the designer/administrator.

Regardless, if you feel that you need someone to give you some advice, or to take a second look at your network, don’t be afraid to ask for help.  There are many resources on the internet one can use to look for advice and tips.  If you need a more personal touch, contact an MSP such as my company.  We’d be more than happy to help you out.

Posted by: mzuzarte | May 10, 2010

Does your network have a WIFI security breach?

Information leaks are a major problem in today’s corporate networks, more so than people think, or want to think about.  One way information can leak is a user installing a wireless access point on the LAN, effectively giving anyone capable of connecting access to the corporate network.  These access points are referred to as “rogue access points”.  The user who installs one may be unaware of the consequences, perhaps just wanting wireless internet access for a mobile device, or they could have purposely installed the device to give a black-hat hacker remote access to the network (say, from a car outside the building).

End To End offers a rogue AP scan service where a tech comes out at intervals to scan the network for unauthorized access points.  Essentially, the tech walks around the company premises (and outside as well) scanning the nearby wireless networks.  At each point the tech takes a capture recording the signal strength of each available network.  Afterwards the tech can compile this data and make an educated guess as to where exactly an access point is located (the stronger the signal, the closer the tech is to the access point).  If an AP’s signal is strong close to the company premises, that should set off some alarms, as the AP could be located on site and connected to the corporate LAN.
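
If you want to put rough numbers on the “stronger signal means closer” reasoning, here’s a small Python sketch using the standard log-distance path-loss model.  The reference RSSI at one metre and the path-loss exponent are assumptions; real walls, floors and antennas mean you’d have to calibrate them on site, so treat the output as a ballpark, not a location.

```python
# Rough AP-proximity estimate from measured RSSI (log-distance path-loss model).
# Both constants below are assumed values for illustration, not measurements.

RSSI_AT_1M = -40.0   # dBm at 1 m from a typical AP (assumption)
PATH_LOSS_N = 3.0    # indoor path-loss exponent, usually 2-4 (assumption)

def estimate_distance(rssi_dbm, rssi_1m=RSSI_AT_1M, n=PATH_LOSS_N):
    """Estimate distance in metres from a measured RSSI reading."""
    return 10 ** ((rssi_1m - rssi_dbm) / (10 * n))

# Readings taken at different spots on a walk-around: weaker = farther away
for rssi in (-45, -60, -75):
    print(f"RSSI {rssi} dBm -> roughly {estimate_distance(rssi):.1f} m")
```

Comparing estimates from a few capture points is exactly the “educated guess” described above, just with arithmetic instead of eyeballing.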

I used to use two applications for wireless scanning.  One was Kismet, available for most Linux distros, but it can be a pain to save data from, and Linux does not always run well on recently released laptop hardware and wireless cards.  The other app is called NetStumbler, which worked quite well for the purpose, but as with most freeware/begware programs, development has fallen behind and it has no support for Windows 7.  With these issues bugging me and a wireless audit scheduled for the near future, I set out to find some new scanning applications.  Two graphical applications caught my eye, and both are supported under Windows 7.  Mind you, I’ve only been able to play with these apps for a limited amount of time, but here are my initial observations.

The first is called inSSIDer and, of the two, it’s the one I like best… thus far.  It has a visually appealing interface that could look good in reports, and it’s very easy to use.  However, I found some issues when trying to save a capture: it saved the file but then refused to open it afterwards.  After some fiddling it opened the file, but the graphs did not appear properly and some information seemed to be missing.  Regardless, while scanning, data appeared quite fast, and I like the graph displaying changes in signal strength; this could be very useful when troubleshooting wireless connectivity issues.  This app was also able to find two APs that the other app could not, and even though their RSSI was -100 dBm, meaning the APs were practically out of range, it was nice to see it pick them up.


inSSIDer screenshot

The second application is called Vistumbler, which is much less visually appealing but presents you with far more data right off the bat.  There is no pretty graph or colour-coded lines, but the simplistic interface and fast operation reminded me a lot of NetStumbler.  From what I’ve read, it also does a better job of GPS AP location than inSSIDer.  However, inSSIDer did manage to find two APs that Vistumbler did not detect.  I guess more testing is in order to determine which of these apps is the better fit.


Vistumbler screenshot

Give these apps a shot and scan your wireless network.  See if there is anything around that surprises you; if you have any questions, feel free to give me a shout.  I also wouldn’t mind hearing some feedback about the pros and cons of the above apps.  Here are some links for downloads:

NetStumbler –

Kismet – Check your Linux repository

inSSIDer –

Vistumbler –

While doing my usual security research, I stumbled across the following article:

I was a little surprised to see that so many firms still give users admin access to workstations.  I understand that there are special situations where admin rights are required (laptops, etc.), but even in those cases I believe users should have two accounts: one with regular user access and one with admin rights, used only for making authorized changes to the machine.

In Windows 7 particularly, Microsoft has done a great job of giving the user the option to log in with admin rights just to perform a specific function.  For example, Mary is working on her machine but needs to install a plugin.  Mary has already been educated on how to tell if an installer is legitimate, so she knows the package is safe.  She clicks the install button but does not have admin rights.  Windows then prompts her to log in with her admin account to install the software.  The operation is almost as seamless as in Ubuntu Linux, which locks the root account by default and has you perform admin actions with the “sudo” command.

Anyhow, if your firm is the type who hands out Admin rights to users, read the above article and rethink your strategy.

Posted by: mzuzarte | May 7, 2010

What ever happened to Anti-Virus software?

Well, it’s been a while since I’ve had a chance to blog about anything!  House purchasing, wedding planning, work and tons of security research have kept me busy.  However, on a long subway ride from downtown Toronto yesterday, I started thinking about the types of protection our networks and systems had back in the day, before I was even involved in the IT industry.

One thing that everyone was talking about years ago was anti-virus software.  Norton, McAfee and a few others dominated the market, and everyone was always terrified about getting a virus, rendering their systems unusable or leaking important company data to the internet.  Remember the Sasser worm from the early 2000s?  The worm was able to propagate from machine to machine without any user interaction; it exploited a buffer overflow in Microsoft’s LSASS (Local Security Authority Subsystem Service) to cause machines to reboot.  Not fun.  Anti-virus vendors and Microsoft scrambled to release patches and virus removal tools to help mitigate the effects of the worm.

Between then and now, anti-virus software began to lose popularity.  Companies have opted to spend their precious IT budget dollars on equipment that can help mitigate attacks and limit their effect on an organization.  IDS/IPS units can download signatures as frequently as the user configures, and firewalls have become more granular in their restrictions, able to integrate other filtering services to block attacks and malware.

Due to the continuing evolution of threats and their growing complexity, security vendors are forever scrambling to push the limits of what they can provide to secure the network.  However, I can’t say that anti-virus has shared the drive of competing solutions; it has been unable to protect against the last few major attack outbreaks and, frankly, many people have not been surprised.

Now it comes down to the IT department.  Some still hold to the belief that anti-virus protection will save them when the time comes, but many have already decided to go with cheaper anti-virus solutions and start beefing up the protection they have at the lower levels of the OSI model.  I strongly suggest that organizations rethink their stance on security and research the new tools out there; many are low cost or even free.  Also look into new products that can help with security management.  Last but not least, educate the end user, even if they resist, kicking and screaming.  Information security is everyone’s responsibility.

Posted by: mzuzarte | April 9, 2010

Juniper SRX Firewalls

Yesterday I had the privilege of attending a training session put on by Juniper regarding their SRX firewalls.  For those of you who don’t know, sometime in the distant future Juniper plans to discontinue their ScreenOS based SSG firewalls and replace them with SRX firewalls powered by JUNOS…

Alright, here’s some historical background.  Once upon a time there was a firewall called NetScreen, which ran an operating system called ScreenOS.  Eventually Juniper Networks bought NetScreen and, after some time, launched the Juniper SSG firewall, ditching the NetScreen name.  The SSG ran ScreenOS as well.  Juniper also sells routers and switches, which run an OS of Juniper’s own creation called JUNOS.  The SRX effectively retires ScreenOS and brings JUNOS into the firewall market.

Anyhow, back on topic.  The training session had only about eight people, from a few MSP firms in the Toronto area.  We were all totally new to JUNOS, so the session started us from scratch.  An overview of the system architecture followed by some instructor-led labs were on the menu.  Oh, and speaking of menus, the Boston Pizza delivery was excellent!  It’s always good when you are fed well at these training seminars.

Well, enough babbling.  Here are some points I noted about the advantages and disadvantages of the SRX.  Note that many of the disadvantages are temporary; the SRX is still being developed and most of the issues should be ironed out in the near future.

The Good:

  • Conservatively speaking, ScreenOS should be around for the next six years, possibly longer if JUNOS on the SRX runs into issues.  Steve, the instructor, admitted that ScreenOS is currently more stable and mature than the SRX is.
  • The SRX 210 has a whopping 1 GB of flash!
  • Changes made to the SRX aren’t put into action as soon as you hit Enter.  After changes are made, a “commit” must be performed to save them and put them into the running configuration.
  • The unit checks changes to ensure the config makes sense before committing; otherwise an error is returned and the config is not committed.
  • The SRX makes a backup of the current config when you perform a “commit”, so if the changes you made don’t work, you can roll back to the last config.  You can keep up to 49 rollback points.  You can also set up the SRX to FTP the backup config to a server every time you perform a commit.
  • You can create a rescue config save point, so that when you default a box it goes to the rescue config rather than the factory default.
  • It can do pattern search-and-replace in the config (e.g. “replace pattern … with …”) without you having to manually scour the config.
  • It can accept 3G ExpressCards.
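
The commit/rollback workflow from the points above looks roughly like the following CLI session.  This is a sketch from memory of the training, not a verified transcript, so exact prompts and output lines may differ by JUNOS version:

```
user@srx> configure                  # enter configuration mode
[edit]
user@srx# set system host-name branch-fw
user@srx# commit check               # validate the candidate config only
configuration check succeeds
user@srx# commit                     # validate, save and activate
commit complete
user@srx# rollback 1                 # load the previous config as candidate
load complete
user@srx# commit                     # activate the rolled-back config
```

Coming from ScreenOS, where changes take effect the moment you hit Enter, this candidate-then-commit model takes some getting used to, but it’s what makes the sanity checking and 49 rollback points possible.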

The Bad:

  • No support for SSL VPN (yet).
  • Tunnels must terminate in the main routing table, not in a custom VR.  This could create issues with access control.
  • Configuration is a bit cumbersome at first.  Commands are long-winded and wordy, but the autocomplete works very well!
  • No wireless (yet).
  • With a root login, you can access the underlying BSD OS.  This can be very dangerous in the hands of an inexperienced user, but it also has its advantages (you can perform administrative functions unavailable on most other vendors’ devices).
