In the last several years, companies have accelerated their cloud adoption, investing time and resources to lift and shift their content, development environments, and applications to public and private clouds. The onset of the global health crisis has pushed even traditional brick-and-mortar companies to invest in cloud technologies. Yet we still see customers hosting content in on-premises repositories despite inexpensive per-GB cloud storage. Why is that?

Traditional pain points

Cloud technologies enable users to access their files from anywhere, as long as they have robust internet access. However, many applications still require minimal latency (response time) and quick access to these files. For example, consider a user opening a large AutoCAD file with many file references associated with it. The user’s application works best if these files are hosted on the local drive or on a network drive letter within the company’s firewall. Customers in this scenario ask for an on-premises server that hosts copies of the cloud files, so that users’ local drives are not taken up by these large files.

Many customers would also like a fallback for internet outages that disrupt users’ access to their business files via the cloud. In certain scenarios, places like remote offices or construction job sites do not have consistently good internet connectivity. Customers with regional offices look to centralize content in the cloud and deploy on-premises servers that replicate content between the cloud and the other offices.

These have been the traditional problems that hybrid solutions have solved over the years. But the onset of the pandemic has brought a structural change in the way companies run their business, and a need to adapt to “the new normal.”

People working from home and going into the office occasionally

While there’s been a lot of talk about permanent work from home as “the future of work,” we believe most companies will still have some employees working from the office full time or going in a few days a week. However, the shift toward employees working from home has forced IT to further streamline the infrastructure footprint, such as servers and storage, within offices.

With employees distributed across home and office locations, it becomes challenging for IT to set up a content repository that efficiently serves both cohorts. Customers traditionally set up storage servers within the office and provide VPN access for remote employees. However, users connected over VPN frequently complain about poor performance. Other customers supplement on-premises storage with a cloud-collaboration platform, which fragments data across on-premises storage and cloud collaboration platforms.

An intelligent hybrid solution solves this problem by centralizing content and permissions in the cloud, while still caching the files users in the office need. This is where a solution like Egnyte’s Smart Cache can address the ongoing challenge of quick access to the right files, both at home and at the office, while reducing the administrative overhead of cache device management.
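Egnyte has not published Smart Cache’s internals, but the general edge-caching idea it embodies (keep the hot files on the local device, evict cold ones when space runs out) can be sketched as a simple least-recently-used cache. All names below are illustrative, not Egnyte APIs:

```python
from collections import OrderedDict

class EdgeFileCache:
    """Illustrative LRU edge cache: hot files stay on the local
    device; the least-recently-used files are evicted when the
    device runs out of space. A sketch of the general idea only."""

    def __init__(self, capacity_bytes):
        self.capacity = capacity_bytes
        self.used = 0
        self._files = OrderedDict()  # path -> size, in LRU order

    def get(self, path, fetch_from_cloud):
        """Serve a file locally on a hit; on a miss, fetch it from
        the cloud (hypothetical callable returning the file size)
        and admit it to the cache."""
        if path in self._files:
            self._files.move_to_end(path)  # mark as recently used
            return "hit"
        size = fetch_from_cloud(path)
        self._admit(path, size)
        return "miss"

    def _admit(self, path, size):
        # Evict least-recently-used files until the new one fits.
        while self.used + size > self.capacity and self._files:
            _, evicted_size = self._files.popitem(last=False)
            self.used -= evicted_size
        self._files[path] = size
        self.used += size
```

Users in the office see local-drive performance on cache hits, while the cloud remains the single source of truth for content and permissions.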

What about companies that have eliminated physical offices and moved to fully remote workforces? This introduces a new set of problems for remote employees: 

  1. How to efficiently access business files from a home environment: some of these files can be quite large, and employees may not have fast internet connections.
  2. How to access software applications, such as database systems, hosted within the company’s data center: traditional technologies such as VPN can make that access painfully slow.

Virtual-office customers trending toward public clouds while staying cost-sensitive

Given these pain points, we are seeing an emerging trend of customers setting up virtual offices on public clouds like AWS and Azure. In this scenario, the infrastructure is outsourced to cloud hosting companies and housed within their data centers. Users connect to remote terminals and VDIs on these public clouds and access content and applications much as they would in a physical office. This lift-and-shift of content and applications to public clouds helps meet growing storage requirements and eliminates the need for expensive technologies such as VPN to reach the office environment. However, it does not address challenges around collaborating with external partners or sharing files with other business units, so many companies continue to use file-sharing solutions alongside the public clouds. The result is data fragmented across public clouds and other cloud platforms.

Many companies have chosen to consolidate their content on the Egnyte cloud while retaining the flexibility to host their business applications on AWS or Azure. Customers centralize their content and associated permissions in the Egnyte cloud, while their users open business applications through remote terminals on AWS or Azure and access the content stored in Egnyte. Furthermore, IT can deploy Egnyte’s Smart Cache solution inside this virtual office to host the most frequently accessed files and reduce network egress between Egnyte and the public cloud.

Point-cloud and automated applications for certain industries (construction, life sciences, media and entertainment, etc.)

For some customers, especially those in the media, construction, and life sciences industries, it is not only about users and their access to business content and applications. These customers run automated, resource-hungry applications, such as media-editing software, point-cloud processing, or DNA sequence analyzers, hosted in public clouds. These applications require quick access to large data volumes, and sometimes to specific byte ranges within a large file. Such file requests, which were traditionally served by Windows and Linux machines over LAN protocols such as SMB, now need to be supported on public clouds.
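To illustrate the byte-range pattern these applications depend on, a ranged read fetches only the slice it needs instead of the whole file; object stores such as S3 express this with an HTTP Range header (RFC 7233, inclusive end offset). This is a generic sketch, not Egnyte-specific code:

```python
def range_header(start, length):
    """Build the HTTP Range header an S3-style GET uses to request
    only part of an object (the end offset is inclusive)."""
    return {"Range": f"bytes={start}-{start + length - 1}"}

def read_byte_range(path, start, length):
    """Local equivalent over a filesystem protocol such as SMB:
    seek to the offset and read only the requested slice, rather
    than loading the whole multi-gigabyte file."""
    with open(path, "rb") as f:
        f.seek(start)
        return f.read(length)
```

For example, a point-cloud viewer that needs a 1 MB tile at offset 0 would send `Range: bytes=0-1048575` rather than downloading the entire file.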

Egnyte’s hybrid solution deployed within a public cloud provides the SMB protocol these specialized applications need. The solution is also designed to serve applications that expect content in S3 or Azure Blob storage.

Data collection for life sciences and construction (telemetry, drones, etc.)

For many companies, the public cloud environment does not operate in a silo. Often it is where the heavy-lift operations and post-processing of content occur, while the raw data is generated at a physical location. One example is construction companies that capture large volumes of media files with drones and then upload them to a public cloud for post-processing with point-cloud applications. Another example is biotech companies whose laboratory telemetry devices generate several GBs of data every hour; this raw data is then analyzed with advanced applications on AWS or Azure.

How do they get the raw data generated by machines, telemetry devices, sensors, or drones into these public clouds? That is often the most challenging step in the pipeline. Egnyte’s customers in these industries often deploy Egnyte’s hybrid solution at their regional offices to migrate this data. Field workers offload the raw data from the drones onto the hybrid device, which then uploads it to the cloud. Similarly, telemetry devices can stream raw data in real time to hybrid devices in the lab, which then migrate it to the cloud.
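The upload leg of this pipeline can be sketched as draining a local spool directory in resumable chunks. The `send_chunk` transport below is hypothetical; it stands in for whatever protocol the hybrid device uses to reach the cloud:

```python
import hashlib
from pathlib import Path

CHUNK_SIZE = 8 * 1024 * 1024  # upload in 8 MiB pieces

def iter_chunks(path, chunk_size=CHUNK_SIZE):
    """Yield a large raw-data file (e.g. drone footage) in fixed-size
    chunks so an interrupted upload can resume mid-file."""
    with open(path, "rb") as f:
        while True:
            chunk = f.read(chunk_size)
            if not chunk:
                break
            yield chunk

def upload_file(path, send_chunk):
    """Stream one file chunk-by-chunk through the caller-supplied
    transport, returning a checksum the cloud side can use to
    verify the file arrived intact."""
    digest = hashlib.sha256()
    for chunk in iter_chunks(path):
        digest.update(chunk)
        send_chunk(chunk)
    return digest.hexdigest()

def drain_spool(spool_dir, send_chunk):
    """Upload every file sitting in the local spool directory
    (where field devices drop raw data), oldest first."""
    results = {}
    for path in sorted(Path(spool_dir).iterdir(),
                       key=lambda p: p.stat().st_mtime):
        if path.is_file():
            results[path.name] = upload_file(path, send_chunk)
    return results
```

Chunking keeps a dropped connection from forcing a multi-gigabyte re-upload, and the checksum lets the receiving side confirm integrity before the local copy is cleared.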

Conclusion 

The cloud continues to be an important storage and application platform for most companies in this changing world. However, market trends show that “pure” cloud platforms are not enough and need to be bolstered with an edge cache solution. Egnyte’s intelligent hybrid solution, Smart Cache, is a strong option for many companies and continues to evolve to meet these emerging challenges.

