
Saving Money On Amazon EC2 and Other Cloud Hosts

With everything being “in the Cloud” these days, many people are moving their hosted websites away from static hardware to a Cloud-based hosting solution – either a shared service or their own server hosted by a provider. We love Amazon’s EC2 infrastructure: we use it for most of our services and recommend it to our clients as a great solution, but it can quickly become costly. There are a few simple things you can do to substantially reduce your monthly bill. Here are some ways to save money on Amazon EC2 and other Cloud providers:
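For illustration, one generic cost-saving tactic is shutting down development or test instances when no one is using them. Below is a minimal sketch using boto3; the Environment=dev tag, the region, and the schedule it would run on are assumptions for the example, not details from this article.

```python
# Hypothetical sketch: stop running instances tagged Environment=dev.
# The tag name and region are assumptions made for this example.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# Find running instances carrying the (assumed) Environment=dev tag.
reservations = ec2.describe_instances(
    Filters=[
        {"Name": "tag:Environment", "Values": ["dev"]},
        {"Name": "instance-state-name", "Values": ["running"]},
    ]
)["Reservations"]

instance_ids = [
    inst["InstanceId"]
    for res in reservations
    for inst in res["Instances"]
]

if instance_ids:
    # Stopped instances stop accruing compute charges (EBS storage still bills).
    ec2.stop_instances(InstanceIds=instance_ids)
    print(f"Stopped: {instance_ids}")
```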

Read more

Is the NSA Spying On You? They Might Be…

With all the news lately about the NSA using everything from Angry Birds to your Xbox to your Yahoo! accounts, you may be concerned that the NSA or someone else is spying on you and your data. Unfortunately, unless you are using the right security tools to monitor what’s happening on your computer or phone, you may never know. Any data you store or interact with in the Cloud can also be used to spy on you.

The best way to protect yourself from prying eyes is to use inbound and outbound monitoring software on your computer to see who or what is connecting to it and what it is sending back out to the Internet. On the Mac we like to use Little Snitch, and on our Windows PCs we use ZoneAlarm or NetLimiter. As for personal or private data you keep in the Cloud, there isn’t much you can do about that, because the Cloud provider or service provider hosting your data can grant access to organizations with the right credentials that want it. On your game console or phone things become much more complicated, as you’ll need very specialized software and tools to monitor them. We’ll cover tools for phones and game consoles in another article.
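If you want a quick, do-it-yourself look at what your machine is talking to, a few lines of Python can list established connections and the processes behind them. This is only a rough sketch using the psutil library (our choice for illustration, not one of the tools named above); dedicated tools like Little Snitch give far better visibility.

```python
# Rough sketch: list established network connections and the owning process.
# May require admin/root privileges to see processes owned by other users.
import psutil

for conn in psutil.net_connections(kind="inet"):
    if conn.status == psutil.CONN_ESTABLISHED and conn.raddr:
        try:
            proc = psutil.Process(conn.pid).name() if conn.pid else "unknown"
        except psutil.Error:
            proc = "unknown"
        print(f"{proc:<20} -> {conn.raddr.ip}:{conn.raddr.port}")
```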

Read more


WordPress, Drupal, and Joomla! Which CMS Should I Use?

It was only a matter of time before bloggers around the world whipped out their laptops to write long-winded comparisons of the three major Content Management Systems taking over the CMS industry. The three most popular by far are WordPress, Joomla!, and Drupal. Now that these systems are becoming more and more integrated into the business world, people everywhere are logging into their blogs to share how they feel about them. Browsing the web, you can find WordPress loyalists, Drupal enthusiasts, and Joomla! fanatics. The truth? All of these CMSes have their pros and cons. So which CMS should you use?

WordPress can be considered the Content Management System that arrived late to the game, but it is the underdog that became the go-to CMS for bloggers. WordPress is an excellent system for creating a quick, professional-looking website, and while it is most often used as a blog, it can be put to work in many other interesting ways as well.

Drupal is a content manager’s dream come true. If you’re the type of person who would rather hand-code the content of your pages, a CMS like WordPress will be nothing more than a disappointment. Drupal, on the other hand, allows developers to completely customize websites, blogs, and more through a completely open platform designed for the code-friendly content manager. Another benefit of Drupal is its huge support community, so experienced content managers can help newer users who may not be as familiar with coding get acquainted with the platform.

Joomla! is more like Drupal than WordPress: it is a content management system designed with the developer in mind. Many call Joomla! an excellent combination of WordPress and Drupal. The name Joomla, in fact, is derived from the Swahili word ‘jumla’, meaning ‘all together’, and the project has been living up to that name in how this powerful CMS works. Designers choose Joomla! for the capabilities its engine offers in making websites look fantastic, and newcomers enjoy it for its user-friendly features and wide community support network.


Apple Security Issues Continue With New iOS 7 Flaws

Apple really needs help with security testing as part of its release cycle for new software. In particular, iOS 7 flaws keep turning up that should have been caught prior to release. The latest issue is that anyone can bypass the iPhone’s lock screen to hijack photos, email, or Twitter. There is also a bounty for hackers who can crack the “Touch ID” fingerprint sensor. Some security analysts claim that will be a near-impossible feat, but my money is on the hackers.

So why do we hear of so many instances where a company, often a large one, releases a new product or service that has gone through its QA team, only to have the public and its customers find serious security issues and bugs? Partly it’s a “can’t see the forest for the trees” problem. You and your team are often too close to your own products to bring a truly fresh perspective. No matter how sophisticated your team, processes, tools, and test cases are, you are still limited, and to some degree myopic, by those same tools, processes, and people. Whatever your QA process is, your results are still bounded by the definition of that process; if the process has a flaw in it, chances are good you’ll release software in which your customers will find issues.

So what can you do to increase your odds of finding issues in-house, before your software leaves the lab? Secondary QA testing by a professional software quality assurance company will help. That means bringing into your release process a separate QA team that employs different processes than your own and that isn’t as familiar with your product or service as you are, so it isn’t already “tainted” by expectations about how it should work, look, and feel. Our company, QuadrixIT, is often brought in by other software development firms to provide a fresh look and a fresh test perspective that supplements their in-house test teams and processes. More often than not we’re able to report back new bugs, in functionality, feel, and security, that they completely missed.

Some companies try to further reduce bugs by adding a beta test cycle, in which some of their customers test the software after the QA team has finished, near the tail end of the development cycle. While this should certainly be standard practice, beta testing does not replace the expertise of a secondary QA team applying its own test processes, tools, and methodologies. If you can use a secondary QA team whose test methodologies differ from your team’s (while still following accepted QA practices), you can be that much more confident that those fresh eyes will find issues your team may not have caught.

Apple and other companies can try to keep the QA team and the development team separate enough from each other to emulate this “freshness”, but in practice that doesn’t work well in the corporate world. Corporate process usually dictates a shared-expertise model, so that resources can be cross-utilized between teams to meet project timelines and release and development cycles. That close team integration may be positive for product design and for sharing tools, but it is a sure way to corrupt the fresh perspective needed for testing. Apple certainly could have benefited from having a secondary QA team review iOS 7 and potentially catch these flaws prior to release.


Can the Next Edward Snowden Be Stopped?

Whether you view Edward Snowden, the former CIA employee who blew the doors open at the ultra-secret NSA, as a patriot or as a spy, you can be assured that the CIA and NSA will take strong measures to put the lid back on how they do business.

The director of the NSA, Gen. Keith B. Alexander, says that his agency will institute a two-man rule, similar to what is used on submarines, where both the commanding officer and the executive officer must agree that an order to launch is valid. The idea is that it will take two people to gain access to server rooms, and that system administrators (SysAdmins) will be paired together when accessing sensitive intelligence. Access to data will also be limited by not storing as much of it on a single server.
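As a purely conceptual sketch of what a two-person control looks like in software (our illustration only, not how the NSA actually implements its controls), the core rule is simply that no single person can approve their own access:

```python
# Conceptual sketch of a two-person authorization gate (illustrative only).
# Access to a sensitive action is granted only when two distinct,
# authorized people sign off.
AUTHORIZED_APPROVERS = {"sysadmin_a", "sysadmin_b", "duty_officer"}

def two_person_authorized(requester: str, approver: str) -> bool:
    """Return True only if requester and approver are two different
    authorized people -- no one can approve their own access."""
    return (
        requester in AUTHORIZED_APPROVERS
        and approver in AUTHORIZED_APPROVERS
        and requester != approver
    )

if two_person_authorized("sysadmin_a", "sysadmin_b"):
    print("Access granted: both parties present and distinct.")
```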

However, given what a SysAdmin’s job entails, how will the two-man rule be implemented, and will it truly be effective? Many organizations protect their secrets through ‘security by design’, where the software or systems have been designed from the ground up to be secure, or through ‘security through obscurity’, where secrecy of the design or implementation provides the security.

Effective security is not just about the technology managing the secrets but, more importantly, about the management of the people who hold those secrets. The problem is that the SysAdmin role is defined by access to the systems; SysAdmins are usually the ones who hold the keys to the kingdom. With a two-man authentication system the NSA will certainly see a slowdown in the amount of data it can review, since approvals for both the ingress and egress of that data and its systems must be done in tandem. And with the advent of ever bigger Big Data and Cloud-based data solutions, the problem becomes exponentially more difficult to manage.

How can a SysAdmin, who by the nature of the job has access to enormous amounts of sensitive information, be regulated and controlled? To start, the NSA has said it will cut its SysAdmins by 90% to limit data access. Gen. Alexander has said that “what we’ve done is we’ve put people in the loop of transferring data, securing networks and doing things that machines are probably better at doing.” Using technology to automate the work done by employees and contractors would make the NSA’s networks “more defensible and more secure,” Gen. Alexander said at a cybersecurity conference in New York City.

Regardless of the security technologies implemented, the security processes in place, and the systems protecting against the release of those secrets, security still boils down to trust in the people who control those systems. What’s to stop a person who manages the NSA’s new control systems from releasing the secrets? Will the next Snowden be the person who manages those control systems, or the person who wrote the software behind them? Implementing the two-man rule, reducing the number of people with access, and bringing in new control systems will help the NSA, but it will come at a high cost in efficiency. The solution to not having another Snowden lies not only in the security processes put in place to protect the secrets, but in the simplest part of the equation: ensuring that prospective analysts undergo more stringent interviews and background checks before hiring, and ongoing recertification afterward. It turns out that Booz Allen Hamilton, the firm that hired Snowden as a subcontractor, had concerns when it found discrepancies in his resume, yet it hired him anyway.

It used to be that when someone joined the military and applied for a secret or top-secret clearance, not only would they be interviewed by the FBI and the hiring branch of service, but so would their friends, and their friends’ friends. That hiring and approval process was exhaustive. A good start to avoiding another Snowden would be to tighten up the interview and background-checking process. Having subcontracted firms be responsible for approving prospective NSA personnel is not the most effective way to weed out poor candidates. All potential NSA personnel should go through extensive checks beyond what a subcontracted company can provide, and that responsibility should be given back exclusively to the government. Strength in security must start with the individual who is hired, not rely solely on the systems in place.

Joseph Gutwirth, Founder and CEO, QuadrixIT

Setting Up a Business Network May Not Be Enough

It’s an all-too-common pattern: most companies want to focus on their business objectives, not their network infrastructure. While setting everything up, starting to use it, and not worrying about it again until there’s an issue can seem like the simplest approach, it can be one of the most costly decisions you make when setting up a business network.

With so many diverse technologies in play (operating systems, application solutions, email solutions, intranet, extranet, web sites, SSL, etc.), figuring out how best to maintain such a sprawl can be intimidating. Many companies can’t afford a full-time IT person on staff, let alone a full IT team. And companies that do have full-time IT staff may have them focused on internal projects rather than on keeping the company’s infrastructure solid, secure, and reliable.

Leaving your company’s infrastructure alone until there’s an issue will, at some point, open you up to security, quality, and performance problems. Most software, firmware, and hardware have critical updates released over the life of the product. Operating systems and applications are particularly prone to security issues and must be updated on some reasonable cycle to provide as much protection as possible. Of course, new versions can also introduce problems where none were present, which is another reason it’s important for a company to have a process in place for retaining data, including prior operating systems, system versions, and so on. Many critical network devices and appliances routinely receive firmware updates that affect security, quality, reliability, and performance. Routers, hubs, switches, and wireless devices all need to be maintained to ensure the most recent bona fide releases are installed and functioning nominally.

If your company does not have the budget for IT staff to manage your network, information security, and the reliability of your systems, consider hiring a reputable firm to perform routine maintenance. Typical maintenance includes testing performance speeds on your network, sniffing your network for potential intrusions, and analyzing workstations, their applications, and email systems to ensure they are functioning properly and have not been breached. It’s not uncommon for a company that has been compromised to not even be aware of it. A compromise can mean intentional performance degradation caused by a competitor or other malicious actor, or something even more nefarious, such as theft of data or intentional disruption of services and critical infrastructure.
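As one small, concrete example of the kind of routine check such a maintenance pass might include (an illustrative sketch of ours, not a specific Quadrix tool), a short script can probe internal hosts for services that should never be exposed; the host addresses and port list below are hypothetical.

```python
# Illustrative maintenance check: probe internal hosts for ports that should
# not be open, as a quick signal of configuration drift or compromise.
import socket

HOSTS = ["192.168.1.10", "192.168.1.20"]   # hypothetical workstation IPs
UNEXPECTED_PORTS = [23, 3389, 5900]        # telnet, RDP, VNC

for host in HOSTS:
    for port in UNEXPECTED_PORTS:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            s.settimeout(1.0)
            if s.connect_ex((host, port)) == 0:   # 0 means the port accepted
                print(f"WARNING: {host} has port {port} open")
```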
If you don’t keep your systems up to date, it’s only a matter of time before you lose critical systems that keep your day-to-day business running. Consider hiring a reputable firm, such as ours, to come in, assess the status of your infrastructure, and propose ways to keep your business in business.

Contact Quadrix for a free network and security assessment to ensure your company’s business continuity.