Don’t Play Roulette with Your Tech!

The dangers of gambling with unsupported & insecure software
When it comes to continued use of outdated software, are you putting your business in danger of losing vital data? Is your business a ticking time bomb, one breach away from bankruptcy and going “out of business”?
If your business runs operating system software that is known to be out of date and insecure, the answer is unequivocally yes. For many, that moment has already come and gone with Windows XP (retired April 8, 2014) and Windows Server 2003 (support ended July 14, 2015).
Read more

6 Reasons Why Offsite Backup in the Cloud is 100% Critical to Save Your Business

Businesses that suffer data loss often face not only financial devastation but also the loss of client confidence. Either can determine whether a business survives.
The industries that typically suffer the most frequent and expensive breaches are healthcare, financial, pharmaceuticals, transportation and communications. More than one-third of attacks are on businesses with fewer than 250 employees.
Read more

Dial Up VoIP to Drive Down Telecom Costs & Improve Productivity

Let’s face it. As a business owner, you’re always looking for ways to increase operating efficiency and save money. And technology is a great tool for accomplishing both.

But, how do you go about getting it done?

An increasingly popular and effective approach is implementing Voice over Internet Protocol, more commonly known as VoIP. Basically, this system moves your business phone communications onto your network connection, allowing you and your team to access communication services from anywhere.

Yet, despite its growing use, many business owners are still unsure whether VoIP is the right choice. Does knowing you can reduce your telecom costs by as much as 30% help make the decision easier? Thought so.

Let’s take a look at some primary benefits of VoIP and see how well it fits your business needs.

Read more

What’s New in DataCore SANsymphony-V 10 PSP2

A few weeks ago, DataCore released Product Service Pack 2 (PSP2) for SANsymphony-V 10. As you know if you’ve followed this blog, DataCore is a leader in Software Defined Storage…in fact, DataCore was doing Software Defined Storage before it was called Software Defined Storage. PSP2 brings some great enhancements to the product in the areas of Cloud Integration, Platform Services, and Performance management:

  • OpenStack Integration - SANsymphony-V now integrates with Cinder, the OpenStack block storage service, so storage can be provisioned from the OpenStack Horizon administrative interface. That means that from a single administrative interface, you can create a new VM on your OpenStack infrastructure, and provision and attach DataCore storage to that VM. But that’s not all - remember, SANsymphony-V can manage more than just the local storage on the DataCore nodes themselves. Because SANsymphony-V runs on a Windows Server OS, it can manage any storage that you can present to that Windows Server. Trying to figure out how to integrate your legacy SAN storage with your OpenStack deployment? Put SANsymphony-V in front of it, and let DataCore talk to OpenStack and provision the storage!
  • Random Write Accelerator - A heavy transactional workload that generates a lot of random write operations can be problematic for magnetic storage, because you have to wait while the disk heads are moved to the right track, then wait some more while the disk rotates to bring the right block under the head. For truly random writes, the average latency will be the time it takes to move the heads halfway across the platters plus the time it takes the platters to make half a rotation. One of the benefits of SSDs, of course, is that you don’t have to wait for those things, because there are no spinning platters or heads to move. But SSDs are still pretty darned expensive compared to magnetic disks. With random write acceleration enabled on a volume, write requests are fulfilled immediately by simply writing the data at the current (or nearest available) head/platter position; a “garbage collection” process then goes back later, when things are not so busy, and deletes the “dirty” blocks of data at the old locations. This can deliver SSD-like speed from spinning magnetic disks, and the only cost is that storage consumption will be somewhat increased by the old data that the garbage collection process hasn’t gotten around to yet.
  • Flash Memory Optimizations - Improvements have been made in how the cache reads from PCIe flash cards, in order to better utilize what can be a very costly resource.
  • Deduplication and Compression have been added as options that you can enable at the storage pool level.
  • Veeam Backup Integration - When you use Veeam to back up a vSphere environment, as many organizations do, the Veeam software typically triggers vSphere snapshots, which are retained for however long it takes to back up the VM in question. This adds to the load on the VMware hosts, and can slow down critical applications. With DataCore’s Veeam integration, DataCore snapshots are taken at the SAN level and used for the backup operation instead of VMware snapshots.
  • VDI Services - DataCore has added specific support for highly available, stateful VDI deployments across clustered Hyper-V server pairs.
  • Centralized console for multiple groups - If you have multiple SANsymphony-V server groups distributed among, e.g., multiple branch offices, you no longer have to explicitly connect to each group in turn to manage it. All server groups can be integrated into the same management UI, with delegated, role-based administration for multiple individuals.
  • Expanded Instrumentation - SANsymphony-V now has tighter integration with S.M.A.R.T. alerts generated by the underlying physical storage to allow disk problems to be addressed before they become serious.
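The latency arithmetic behind the Random Write Accelerator bullet above is easy to sketch. The drive parameters here (a 7,200 RPM spindle with an 8.5 ms average seek) are illustrative assumptions, not DataCore figures:

```python
# Illustrative arithmetic for random-write latency on a spinning disk.
# The drive parameters below are hypothetical, not from DataCore.

RPM = 7200           # spindle speed
AVG_SEEK_MS = 8.5    # avg. time to move the heads halfway across the platters

# On average, the platter makes half a rotation before the right block
# passes under the head.
rotation_ms = 60_000 / RPM
avg_rotational_ms = rotation_ms / 2

avg_random_write_ms = AVG_SEEK_MS + avg_rotational_ms
iops = 1000 / avg_random_write_ms   # rough ceiling on truly random write IOPS

print(f"avg rotational latency:   {avg_rotational_ms:.2f} ms")
print(f"avg random write latency: {avg_random_write_ms:.2f} ms")
print(f"~{iops:.0f} random write IOPS")
```

Writing each block at the current head position eliminates most of both terms, which is why the accelerator can approach SSD-like write latency from spinning disks.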

For more details on PSP2 and the features listed above, you can view the following 38-minute recording of a recent Webinar we presented on the subject:

You Can’t Afford to Ignore Windows 10 Anymore

Over the last several months, I’ve been watching the headlines about Windows 10 go by without really paying a lot of attention to them. Perhaps you have as well. Win10 fell into the category of “Things I Need to Look More Closely At When I Have Time.” After all, it hasn’t been that long since I upgraded to Windows 8.1. Then the news broke that “general availability” will be July 29, leading to one of those “Wait…what!?” moments. With the release now less than two months away, I realized I needed to make time.

So I bought another 8 GB of RAM for my 64-bit Fujitsu laptop (blowing it out to a total of 12 GB, woo hoo!), installed Client Hyper-V (which was amazingly easy to do in Windows 8.1 Enterprise), signed up for the Microsoft Insider Preview program, downloaded the Win10 ISO image, and built myself a Win10 VM.

My initial reaction is that it looks pretty good. The current preview build (10074) looks stable, seems to run everything that I’ve thrown at it, and my complaints are pretty minor. I can’t really test multimedia performance, as the preview build doesn’t have drivers that will allow audio pass-through from my Win10 VM to my host PC, but that’s not surprising at this point.

The Start menu is definitely a step in the right direction, but still doesn’t have that one piece of functionality that drove me to install Stardock Software’s Start8 utility: I love being able to click on the Start button, mouse up to, say, the Word or Excel icon, and immediately see the last several documents/spreadsheets I’ve opened, so I can jump directly to them. In Win10, if I pin, say, Word to my taskbar, I can right-click on the Word icon and see a list of recent files – but my personal preference is to reserve my taskbar for programs I’m actually running rather than taking up space with icons for programs I might want to run. Instead, I use the QuickLaunch toolbar for quick access to programs. (What – you didn’t know you could have a QuickLaunch toolbar in Windows 8.x? You can, and it works in the Win10 build I’m running as well, but that’s a subject for another post.) So, when Stardock releases a version for Win10, I’ll probably upgrade to it.

Speaking of upgrades, you’ve probably also heard that users who are running Windows 7, 8, or 8.1 will get a free upgrade to Windows 10. That’s true, depending on what version you’re currently running. There is an upgrade matrix that tells you, for the Home and Pro editions, which version of Win10 you’ll get. And if you’re running Win7 SP1 or Win 8.1 Update, you can get the upgrade pushed to you via the Windows Update function.

Windows Enterprise users will not get free upgrades…apparently the rationale is that most Windows Enterprise users are part of, well, large enterprises that typically have a corporate license agreement with Microsoft that entitles them to OS upgrades anyway, and these enterprises also want to have tighter control over who gets what upgrade and when.

There are also a few other caveats to bear in mind. First, if you’re running Win7 SP1 or later, the chances are pretty good that your system will run Win10 without any problems…but “pretty good” doesn’t mean “guaranteed.” There’s a helpful article over on ZDNet that will walk you through how to find Microsoft’s compatibility-checking utility.

You may also be surprised at the things Windows 10 will remove from your system as part of the Win10 upgrade.

And bear in mind that if you just happily accept the automatic upgrade to Win10, you’re also opting in for all new features, security updates, and other fixes to the operating system for “the supported lifetime” of your PC. These will all be free, but you won’t have a choice as to which updates you do or don’t get – they’ll all be pushed to you via Windows Update. Businesses, whether running Windows Pro or Enterprise, will have more control over how and when new features and fixes roll out to their users, as Mary Jo Foley explains over on ZDNet.

Finally, Ed Bott is maintaining a great Win10 FAQ over on ZDNet that he’s been updating regularly as more information becomes available. You might want to bookmark that one and come back to it occasionally.

I confess that I’m kind of excited about the new release, and I’ll probably upgrade to it as soon as the Win10 Enterprise bits show up on our Microsoft Partner portal. It will be interesting to see how these major changes in how the Windows OS will be distributed and updated will play out over time. How about you? Feel free to share your thoughts in the comments below…

Trend Micro Releases Q1 Security Update Report

Trend Micro has released their 2015 Q1 Security Roundup report. It makes for some interesting reading. While we recommend that you click through and read the full report for yourself, here are some of the main points:

  • None of the prominent threats in the early part of 2015 were new…yet they were still effective. That suggests we still have a lot of work to do in educating our users about what not to do in order to stay safe online.
  • “Malvertising” was a major issue. It’s particularly dangerous in that it doesn’t require people to actually click on a link - the malware is downloaded when an online ad is displayed.
  • A lot of metaphorical rocks have been thrown at Apple over the (sometimes long and drawn-out) vetting process for getting an app listed in their app store. But the dangers of not having a thorough vetting process were demonstrated once again when mobile attackers were able to slip disguised adware into Google Play™.
  • “Crypto-ransomware” infection counts increased roughly fivefold - from 1,540 in 2014 Q1 to 7,844 in 2015 Q1. Some variants directly target enterprises by encrypting files in network shares.
  • Malware in the form of macros for Office apps also increased almost fivefold - from 19,842 in 2014 Q1 to 92,837 in 2015 Q1. This is an area that clearly demands more user training, as many infections were transmitted via email attachments where the recipients were instructed to enable macros in order to read the attachments.
  • Healthcare data is the “holy grail” of data theft, because it frequently includes social security numbers which are arguably much more valuable to a criminal than a credit card number that can only be used until the card is cancelled.
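The growth figures quoted above reduce to simple multiples:

```python
# Quarter-over-quarter growth multiples from the Trend Micro figures above.
crypto_ransomware = 7844 / 1540      # 2015 Q1 vs. 2014 Q1
macro_malware = 92837 / 19842

print(f"crypto-ransomware: {crypto_ransomware:.1f}x")
print(f"macro malware:     {macro_malware:.1f}x")
```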

It’s still a dangerous world out there…surf safely, my friends!

Email Security & Archiving: Achieving Peace of Mind

Nearly 150 billion emails are sent daily. Close to 50% of business email users believe email reduces the day-to-day need for file storage. Now, you may account for only a small portion of that massive number. But, that said … is your data safe? Is downtime in your future?

Undoubtedly, many of those emails have attachments and sensitive information. Many are vulnerable to security breaches. Indeed, this is highly relevant to you. Especially if you share sensitive information.
  • 88% of companies experience data loss; email is the primary culprit
  • 78% of employees access personal email from business computers; that’s double what’s authorized
  • 68% of organizations currently don’t use a secure email service or email archiving solution

Nowadays, when it comes to stopping spam, securing email systems, combating rapidly evolving email threats, and keeping computers running at a fast pace, small and medium businesses (SMBs) often just don’t have the resources for email protection.

What’s at risk is data loss and downtime, which create financial and operational burdens that harm your business. So it’s critical to keep up with continually evolving email security technologies and best practices. It gives you peace of mind.

Read more

MSPs Lead Businesses through the SaaS Maze

Countless businesses trust Managed Service Providers (MSPs) to deploy, manage and support infrastructure solutions both in the cloud and on-premises. The reason is simple: they are the best at eliminating downtime and providing superb user support.

Savvy businesses also recognize the value of engaging their MSP in the search for, and support of, the right applications and Software as a Service (SaaS) solutions for their business. Forrester Research says that IT partners aren’t just critical for the successful deployment of SaaS products - the management aspect is a real value-add.

SaaS is software that runs on remotely hosted hardware, owned and managed by the provider, and delivered via the Internet on a pay-for-use or subscription basis. Without purchasing hardware or software, businesses just connect to the cloud. Deployment can take as little as a day.

Fixes and new features are regularly implemented, which does away with obsolescence. There are also Service Level Agreements (SLAs) that guarantee availability. And there’s no commitment beyond the subscription period, so risk is minimal.

SaaS apps are readily scalable, with capacity available on demand. There’s no up-front capital expenditure. And those using SaaS are “greener” because they share the computing resources of the MSP.

When an MSP provides businesses with SaaS, it starts with Planning & Design. From there, the steps are Procurement, Deployment, Management and Governance. In essence, the MSP handles the entire service process.

Read more

Get Ready for the End of Your (Business) World

Windows Server 2003 Deadline Nears
Nearly a year ago, Microsoft Windows XP support came to an end. Now we are rapidly approaching end of life for Windows Server 2003 (and, with it, SBS 2003).

Are you ready? If you aren’t, what will you do when you wake up the morning of July 15, 2015 … the day after the end of your business world as you know it today? Despite Microsoft warning about the end of life of Windows Server 2003 as much as two years in advance, many small to medium-sized businesses have yet to begin their migration from the platform to a Windows server alternative. Worse yet, many are largely unaware of the huge financial costs and security risks of continuing to run Windows Server 2003 past its end-of-life date.

Again, that date is July 14, 2015, when Microsoft will end extended support on all versions of Windows Server 2003/R2, according to the Microsoft Support Lifecycle section. Mark your calendar; set your alarm.
Read more

The Case for Office 365

Update - May 7, 2015
In the original post below, we talked about the 20,000 “item” limit in OneDrive for Business. It turns out that even our old friend and Office 365 evangelist Harry Brelsford, founder of SMB Nation, and, more recently, O365 Nation, has now run afoul of this obstacle, as he describes in his blog post from May 5.

Turns out there’s another quirk with OneDrive for Business that Harry didn’t touch on in his blog (nor did we in our original post below) - OneDrive for Business is really just a front end for a Microsoft-hosted SharePoint server. “So what?” you say. Well, it turns out that there are several characters that are perfectly acceptable in a Windows file or folder name that are not acceptable in a file or folder name on a SharePoint server. (For the definitive list of what’s not acceptable, see Microsoft’s documentation.) And if you’re trying to sync thousands of files with your OneDrive for Business account and a few of them have illegal characters in their names, the sync operation will fail and you will get to play the “find-the-file-with-the-illegal-file-name” game, which can provide you with hours of fun…
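One way to shortcut that game is to scan for offending names before you sync. This sketch uses the character set commonly cited for SharePoint Online at the time - treat it as an assumption and check Microsoft’s current documentation, since the restrictions have changed over the years:

```python
import os

# Characters commonly cited as illegal in SharePoint/OneDrive for Business
# names circa 2015 -- an assumption for illustration, not the official list.
# (/ and \ are also illegal, but os.walk never yields them inside a name.)
ILLEGAL = set('~"#%&*:<>?{|}')

def illegal_names(root):
    """Yield paths whose file or folder name would break a OneDrive sync."""
    for dirpath, dirnames, filenames in os.walk(root):
        for name in dirnames + filenames:
            # Flag illegal characters, plus leading/trailing whitespace,
            # which SharePoint also rejects.
            if ILLEGAL & set(name) or name != name.strip():
                yield os.path.join(dirpath, name)
```

Run `illegal_names()` over each folder you plan to sync and rename whatever it reports before pointing OneDrive for Business at it.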

Original Post Follows
A year ago, in a blog post targeted at prospective hosting providers, we said, “…in our opinion, selling Office 365 to your customers is not a cloud strategy. Office 365 may be a great fit for customers, but it still assumes that most computing will be done on a PC (or laptop) at the client endpoint, and your customer will still, in most cases, have at least one server to manage, backup, and repair when it breaks.”

About the same time, we wrote about the concept of “Data Gravity” - that, just as objects with physical mass exhibit inertia and attract one another in accordance with the law of gravity, large chunks of data also exhibit a kind of inertia and tend to attract other related data and the applications required to manipulate that data. This is due in part to the fact that (according to former Microsoft researcher Jim Gray) the most expensive part of computing is the cost of moving data around. It therefore makes sense that you should be running your applications wherever your data resides: if your data is in the Cloud, it can be argued that you should be running your applications there as well – especially apps that frequently have to access a shared set of back-end data.

Although these are still valid points, they do not imply that Office 365 can’t bring significant value to organizations of all sizes. There is a case to be made for Office 365, so let’s take a closer look at it:

First, Office 365 is, in most cases, the most cost-effective way to license the Office applications, especially if you have fewer than 300 users (which is the cut-off point between the “Business” and “Enterprise” O365 license plans). Consider that a volume license for Office 2013 Pro Plus without Software Assurance under the “Open Business” license plan costs roughly $500. The Office 365 Business plan – which gets you just the Office apps without the on-line services – costs $8.25/month. If you do the math, you’ll see that $500 would cover the subscription cost for five years.

But wait – that’s really not an apples-to-apples comparison, because with O365 you always have access to the latest version of Office. So we should really be comparing the O365 subscription cost to the volume license price of Office with Software Assurance, which, under the Open Business plan, is roughly $800 for the initial purchase (which includes two years of S.A.) and $295 every two years after that to keep the S.A. in place. Total four-year cost under Open Business: $1,095. Total four-year cost under the Office 365 Business plan: $396. Heck, even the Enterprise E3 plan (at $20/month) is only $960 over four years.
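The license math above, using the approximate 2015 prices quoted in this post, works out like this:

```python
# Four-year cost comparison using the approximate 2015 prices quoted above.
YEARS = 4

# Volume licensing (Open Business): $800 up front including two years of
# Software Assurance, then $295 per additional two-year renewal.
open_business = 800 + 295 * (YEARS // 2 - 1)

# Office 365 subscriptions, billed monthly.
o365_business = 8.25 * 12 * YEARS
o365_e3 = 20.00 * 12 * YEARS

print(f"Open Business w/ SA: ${open_business:,.0f}")
print(f"O365 Business:       ${o365_business:,.0f}")
print(f"O365 Enterprise E3:  ${o365_e3:,.0f}")
```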

But (at the risk of sounding like a late-night cable TV commercial) that’s still not all! Office 365 allows each user to install the Office applications on up to five different PCs or Macs and up to five tablets and five smart phones. This is the closest Microsoft has ever come to per-user licensing for desktop applications, and in our increasingly mobile world where nearly everyone has multiple client devices, it’s an extremely attractive license model.

Second, at a price point that is still less than comparable volume licensing over a four-year period, you can also get Microsoft Hosted Exchange, Hosted SharePoint, OneDrive for Business, Hosted Lync for secure instant messaging and Web conferencing, and (depending on the plan) unlimited email archiving and eDiscovery tools such as the ability to put users and/or SharePoint document libraries on discovery hold and conduct global searches across your entire organization for relevant Exchange, Lync, and SharePoint data. This can make the value proposition even more compelling.

So what’s not to like?

Well, for one thing, email retention in Office 365 is not easy and intuitive. As we discussed in our recent blog series on eDiscovery, when an Outlook user empties the Deleted Items folder, or deletes a single item from it, or uses Shift+Delete on an item in another folder (which bypasses the Deleted Items folder), that item gets moved to the “Deletions” subfolder in a hidden “Recoverable Items” folder on the Exchange server. As the blog series explains, these items can still be retrieved by the user as long as they haven’t been purged. By default, they will be purged after two weeks. Microsoft’s Hosted Exchange service allows you to extend that period (the “Deleted Items Retention Period”), but only to a maximum of 30 days – whereas if you are running your own Exchange server, you can extend the period to several years.

But the same tools that allow a user to retrieve items from the Deletions subfolder will also allow a user to permanently purge items from that subfolder. And once an item is purged from the Deletions subfolder – whether explicitly by the user or by the expiration of the Deleted Items Retention Period – that item is gone forever. The only way to prevent this from happening is to put the user on Discovery Hold (assuming you’ve subscribed to a plan which allows you to put users on Discovery Hold), and, unfortunately, there is currently no way to do a bulk operation in O365 to put multiple users on Discovery Hold – you must laboriously do it one user at a time. And if you forget to do it when you create a new user, you run the risk of having that user’s email messages permanently deleted (whether accidentally or deliberately) with no ability to recover them if, Heaven forbid, you ever find yourself embroiled in an eDiscovery action.
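The deleted-item flow described above can be modeled in a few lines. This is a toy illustration of the behavior, not the actual Exchange implementation, and the 14-day default is the only figure taken from the text:

```python
from dataclasses import dataclass, field

RETENTION_DAYS = 14   # default Deleted Items Retention Period

@dataclass
class Mailbox:
    """Toy model of Exchange's hidden Recoverable Items\\Deletions folder."""
    discovery_hold: bool = False
    deletions: dict = field(default_factory=dict)  # item -> day deleted

    def delete(self, item, today):
        # Emptying Deleted Items (or Shift+Delete) lands the item in the
        # hidden Recoverable Items\Deletions subfolder.
        self.deletions[item] = today

    def run_retention(self, today):
        # Items past the retention period are purged -- gone forever --
        # unless the mailbox is on Discovery Hold.
        if self.discovery_hold:
            return
        self.deletions = {i: d for i, d in self.deletions.items()
                          if today - d <= RETENTION_DAYS}

    def recoverable(self, item):
        return item in self.deletions
```

Running the model shows the trap: without a hold, a deleted item is recoverable on day 10 but gone by day 20; with `discovery_hold=True`, it survives indefinitely.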

One way around this is to couple your Office 365 plan with a third-party archiving tool, such as Mimecast. Although this obviously adds expense, it also adds another layer of malware filtering, an unlimited archive that the user cannot alter, a search function that integrates gracefully into Outlook, and an email continuity function that allows you to send/receive email directly via a Mimecast Web interface if the Office 365 Hosted Exchange service is ever unavailable. You can also use a tool like eFolder’s CloudFinder to back up your entire suite of Office 365 data – documents as well as email messages.

And then there’s OneDrive. You might be able, with a whole lot of business process re-engineering, to figure out how to move all of your file storage into Office 365’s Hosted SharePoint offering. Of course, there would then be no way to access those files unless you’re on-line. Hence the explosive growth in the business-class cloud file synchronization market - where you have a local folder (or multiple local folders) that automatically synchronizes with a cloud file repository, giving you the ability to work off-line and, provided you’ve saved your files in the right folder, synchronize those files to the cloud repository the next time you connect to the Internet. Microsoft’s entry in this field is OneDrive for Business…but there is a rather serious limitation in OneDrive for Business as it exists today.

O365’s 1 TB of Cloud Storage per user sounds like more than you would ever need. But what you may not know is that there is a limit of 20,000 “items” per user (both a folder and a file within that folder are “items”). You’d be surprised at how fast you can reach that limit. For example, there are three folders on my laptop where all of my important work-related files are stored. One of those folders contains files that also need to be accessible by several other people in the organization. The aggregate storage consumed by those three folders is only about 5 GB – but there are 18,333 files and subfolders in those three folders. If I were trying to use OneDrive for Business to synchronize all those files to the Cloud, I would probably be less than six months away from exceeding the 20,000-item limit.
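If you’re wondering how close your own folders are to that ceiling, a quick count - where both files and subfolders are “items” - takes only a few lines (the example paths are hypothetical):

```python
import os

ITEM_LIMIT = 20_000   # OneDrive for Business per-user limit at the time

def count_items(*roots):
    """Count files plus subfolders under each root -- OneDrive's
    definition of an 'item'."""
    total = 0
    for root in roots:
        for _dirpath, dirnames, filenames in os.walk(root):
            total += len(dirnames) + len(filenames)
    return total

# Example (hypothetical paths):
# n = count_items(r"C:\Work\Projects", r"C:\Work\Clients", r"C:\Work\Admin")
# print(f"{n:,} of {ITEM_LIMIT:,} items used")
```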

Could I go through those folders and delete a lot of stuff I no longer need, or archive them off to, say, a USB drive? Sure I could – and I try to do that periodically. I dare say that you probably also have a lot of files hanging around on your systems that you no longer need. But it takes time to do that grooming – and what’s the most precious resource that most of us never have enough of? Yep, time. My solution is to use Citrix ShareFile to synchronize all three of those folders to a Cloud repository. We also offer Anchor Works (now owned by eFolder) for business-class Cloud file synchronization. (And there are good reasons why you might choose one over the other, but they’re beyond the scope of this article.)

The bottom line is that, while Office 365 still may not be a complete solution that will let you move your business entirely to the cloud and get out of the business of supporting on-prem servers, it can be a valuable component of a complete solution. As with so many things in IT, there is not necessarily a single “right” way to do anything. There are multiple approaches, each with pros and cons, and the challenge is to select the right combination of services for a particular business need. We believe that part of the value we can bring to the table is to help our clients select that right combination of services – whether it be a VirtualQube hosted private cloud, a private cloud on your own premises, in your own co-lo, or in a public infrastructure such as Amazon or Azure, or a public/private hybrid cloud deployment – and to help our clients determine whether one of the Office 365 plans should be part of that solution. And if you use the Office Suite at all, the answer is probably “yes” - it’s just a matter of which plan to choose.