Customer News

Streamline Design Workflows With Egnyte’s Smart Cache v4.0 and Revit Worksharing

May 30, 2023

Revit worksharing with Egnyte’s Smart Cache v4.0 can help streamline the design process by allowing multiple designers and architects to work together in real time. This feature improves accuracy, speed, and flexibility, reducing the costly miscommunications that tend to slow project progress.

Challenge

Design collaboration on Revit files can present several challenges for architects and designers alike. While the software offers many tools and functionalities to facilitate teamwork, the sheer complexity of large-scale projects can make collaboration difficult. Managing different versions of the same file, coordinating the work across multiple contributors, and ensuring consistency and accuracy with diverse design elements are just some of the obstacles that can negatively impact project productivity.

Solution

Smart Cache v4.0 with a Windows native drive letter feature is a game-changer for architects and designers who rely on network drives to access their Autodesk Revit files. Collaboration across design teams is streamlined because individual designers and architects can reach files from their laptops through a familiar network drive letter, saving precious time and energy. Whether working individually or as part of a team, Smart Cache v4.0 is essential for architects and designers looking to improve their productivity and efficiency.

For worksharing setup, all Revit users need to be in the same physical office. First, map each Revit user's network drive letter to Egnyte’s Smart Cache device, as sketched below. After setup, users can create their Revit central models on Smart Cache and work on their local models. The Revit worksharing feature allows seamless communication between the local and central models, with Revit sending lock requests and changes to the central model.
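As a rough illustration only: if the Smart Cache device exposes its share over SMB, mapping the drive letter on a Windows workstation could look like the sketch below. The server and share names are hypothetical placeholders; use the values for your own Smart Cache deployment.

```python
import subprocess

# Map a persistent drive letter to the Smart Cache share so Revit can
# address the central model through a stable Windows path.
# "smartcache-01" and "projects" are placeholder names.
result = subprocess.run(
    ["net", "use", "R:", r"\\smartcache-01\projects", "/persistent:yes"],
    capture_output=True,
    text=True,
)
print(result.stdout or result.stderr)
```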

Benefits

When working together on Revit projects, Smart Cache v4.0 provides several advantages, including:

  1. Intelligently save only the required Revit files in the cache
  2. Prevent users from unintentionally modifying the central model in the cloud or other office locations
  3. Propagate file updates only to locations where Smart Cache is installed
  4. Enable real-time synchronization of local models with the Revit central model

For the right balance of flexibility and performance, combining Smart Cache v4.0 with single-site Revit tools to enable remote collaboration is the key to success. These tools improve user productivity and enhance overall project quality, ultimately enabling design firms to perform at their highest potential.

Read the Full Article 

Move off On-Prem File Servers to Unlock New Capability and Cost Savings

May 3, 2023

While there continues to be (limited) debate about on-premises file servers versus cloud file storage, the fight is over, and the cloud has won. If you are still in doubt, take a few minutes to review the limitations and costs of on-premises file servers as well as the benefits of cloud file storage.

Limitations of On-Premises File Shares

Insecure locations for storage

On-premises file servers are often housed in whatever space an office has spare; it is common to find them in coat closets and copy rooms. These locations lack the protections that are de rigueur for cloud file storage (e.g., physical security, 24×7 monitoring, data encryption, two-factor authentication, intrusion prevention systems).

Heightened risk of data loss

With on-premises file servers, data is stored and backed up on internal servers. In the event of a disaster (e.g., flood, fire, theft) that impacts the facility where the file server resides, data could be permanently lost if the system is damaged or destroyed. Historically, this risk has been managed by rotating tapes to offsite storage or simply using a cloud repository as a backup, but these solutions add cost.

Security challenges

When on-premises file servers are used for storage, internal IT teams are responsible for security. However, in most cases, these teams lack the security expertise and resources needed to provide protection that is as robust and effective as what is delivered by cloud file server providers. This includes everything from encryption and firewalls to access controls and monitoring systems.

Limited data backup

While some organizations that use on-premises file servers have some form of off-site backup, in many cases not all data is included. Because of costs, only portions of data are saved off-site. This leaves the rest of the data at risk if a device malfunction, ransomware attack, or other disaster damages the on-premises server.

Inability to scale quickly

On-premises file servers are difficult to scale rapidly, requiring new hardware and IT resources to build, test, and deploy the new systems. This is costly and time-consuming. The resulting delays can negatively affect an organization, inhibiting it from reacting quickly to changing needs.

Difficulty with compliance

Most organizations are subject to some compliance requirements, and even the less stringent ones prove difficult for non-experts to manage. With on-premises file servers, systems must continually be adjusted to stay on top of changing rules. Failing to comply with regulations puts organizations at risk for fines and other penalties. 

Lack of mobility

With an on-premises file server, remote access to your documents and data requires a VPN for secure access, which increases IT complexity and operational costs. In addition, a burst of users trying to access files from the on-premises file server can overwhelm VPN systems and create delays for users who need information.  

Costs Associated with On-Premises File Servers

Setting up an on-premises file-sharing system requires capital investment for the equipment as well as for setting up a space with the appropriate power, climate controls, and security. In addition, special software is required for users to access the on-premises file servers, and VPN capabilities need to be set up for secure connections. 

There are also costs for managing on-premises systems. IT resources are required not just to set these systems up, but also to provide user support and to stay on top of updates, repairs, replacements, and backups.

Costs related to on-premises file servers—at a glance
Direct costs
  • Hardware (e.g., physical servers, cables, spare parts)
  • Off-site backups
  • Security tools 
  • Software 
  • Storage 
  • User support 
  • Various licenses
  • Warranties
Indirect or hidden costs
  • Climate control systems
  • Compensation for IT teams
  • Cost of downtime  
  • Depreciation of the hardware and software
  • Disaster recovery systems
  • Powering servers 24/7
  • Set up, configuration, and ongoing upgrade costs for servers and networks
  • Storage space used for the servers

Advantages of Cloud File Storage

Good news! Cloud file storage is a great alternative to on-premises file servers. This purpose-built solution has been designed specifically to solve the problems associated with on-premises file servers. Below are several of the many benefits of using cloud file storage.

Adherence to compliance regulations 

Cloud file storage providers have teams with compliance certifications and expertise in key industries, such as healthcare, government (e.g., federal, state, local, international), finance, education, manufacturing, and pharmaceuticals. Most also provide reporting to support compliance audits.  

Secure access from any device at any location  

Administrators can set up and enforce access controls according to organizations’ policies. Users can access files, based on permissions, from an office desktop, mobile device, or laptop, regardless of location.  

Enhanced security and data breach protection

With cloud file storage, physical and virtual security is provided and managed by a team of experts with access to the latest solutions. Because of economies of scale, cloud file storage providers have teams of cybersecurity professionals dedicated to monitoring and updating all systems to quickly identify and mitigate threats as well as protect against vulnerabilities. Among the many security systems that are provided with a cloud file storage service are:

  • Access control  
  • Application security
  • Continuous validation
  • Data redundancy
  • Encryption for data in transit and at rest
  • Firewalls
  • Intrusion detection
  • Mass file deletion protection 
  • Network security
  • Physical security 
  • Suspicious activity monitoring 
  • Threat monitoring

Managed backup and recovery

Cloud file storage providers have proven systems and processes to manage backup and recovery. Built-in redundancy, failover, and automatic backups ensure protection from data loss.

No capital and facilities costs

With cloud file storage, there are no upfront capital expenses. The initial, large investment to purchase and install equipment is eliminated. In addition, the facilities-related costs associated with running on-premises file servers, such as cooling, floor space, and electricity, are not required.

Unbeatable scalability

Cloud file storage allows organizations to scale nearly instantly based on changing needs. Scaling can sometimes be automated to optimize resource consumption and costs.  

Cloud File Storage: The Better Choice

Modern organizations should have a file server system that is purpose-built for contemporary requirements. Antiquated on-premises file servers need to be mothballed or used for archives.

Cloud file storage is designed from the ground up to support the requirements of today’s organizations and distributed users. It also gives organizations the agility to adapt to the changing needs of today’s workforce quickly.

Read the Full Article 

Why VPN Is Dead

May 3, 2023

Thanks to the VPN for decades of providing secure remote access to networks. For a time, the VPN was the right technology to protect sensitive information and systems. But the VPN has come to the end of its life: it was built to support access to hardened perimeters, with most systems and applications residing in an organization’s data center.

Now, enterprises have data stored and systems and applications running inside and outside the traditional perimeters in hybrid and multi-cloud environments. This drives the need to replace the VPN with solutions that are purpose-built to secure data in environments with porous perimeters, expanded attack surfaces, and highly sophisticated threats.

While the VPN had been showing weaknesses for a while, the COVID-19 pandemic brought these deficiencies into high relief. Users accessing enterprise resources were, and continue to be, distributed and remote using multiple devices, including BYOD.

To support the exploding numbers of remote workers, enterprises fell back on the tried-and-true VPN. The results made it clear that the VPN was not up to the task of protecting enterprises’ vastly increased and dynamic attack surface.

Primary Reasons that the VPN Met Its Waterloo

  • Blunt approach to secure access
    The VPN lacks the nuanced access controls required to protect enterprises. It is a binary system: once a user is authenticated, they are allowed inside the perimeter and assumed to be trusted. This is a fundamental flaw that violates Zero Trust architecture policies, as it allows an attacker who gains control of a user’s VPN connection to access resources inside the perimeter. Even in segmented networks, an attacker could access the resources in a particular segment. (A schematic sketch of this binary model, contrasted with a least-privilege check, follows this list.)
  • Complex with maintenance overhead
    VPN gateways and client software must be supported and maintained, including regular updates and patches. This is increasingly complex as it includes split tunneling, WAN optimizers, and adjacent security appliances.
  • Cumbersome user access
    Users typically have to log in to their device, then the VPN, and then the service they are using, because most enterprises do not integrate the VPN with single sign-on (SSO). In addition, VPN connections can be slowed down due to the physical distance from resources users try to access remotely.
  • Network performance issues
    VPN appliances not only create bottlenecks by adding an extra leg to the path taken by packets, but also do not support services such as QoS and dynamic path selection.
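To make the binary-trust point concrete, here is a purely schematic sketch, with made-up names and no real product behind it, contrasting a VPN-style yes/no decision with a per-resource, least-privilege check:

```python
# Schematic contrast between VPN-style binary trust and a
# least-privilege policy check. All names are hypothetical.
def vpn_access(authenticated: bool) -> bool:
    # One yes/no decision; afterwards everything inside the perimeter is reachable.
    return authenticated

def zero_trust_access(user: str, resource: str, policy: dict) -> bool:
    # Every request is evaluated per user and per resource.
    return resource in policy.get(user, set())

policy = {"alice": {"crm", "wiki"}}
print(vpn_access(True))                               # True: full network access
print(zero_trust_access("alice", "payroll", policy))  # False: not among her grants
```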

VPN Security Risks

Although VPNs do provide an encrypted connection between two points, they are fraught with gaps that create a number of critical security risks, including the following.  

  • Utilize castle-and-moat-style security, often without traffic inspection, cloud security, or access policies
  • Lack layered defenses and granular access controls, enabling third parties and unauthorized users to gain unfettered access once inside the perimeter
  • Have limited capabilities to identify or stop an intrusion  
  • Lose visibility into cloud-bound traffic, and the ability to inspect it for potentially malicious content, when users connect directly to cloud-based resources to avoid the VPN’s degradation of network and application performance
  • Do not protect against threats, such as malware or data exfiltration 
  • Expose enterprises to attacks when VPN software goes unpatched (one of the top three vectors used by cybercriminals for ransomware attacks)  
  • Are not able to enforce policies that protect credentials, allowing users to share credentials, reuse passwords, or use weak passwords
  • Provide little or no granular audit records, making it impossible to monitor and record the actions of VPN users

Additional risks associated with VPNs include:

  • VPN hijacking, where an unauthorized user takes over a VPN connection from a remote client 
  • Weak user authentication
  • Man-in-the-middle attacks, where an attacker intercepts data in transit
  • Malware infection of a client system
  • Split tunneling, where a user accesses an insecure internet connection directly while simultaneously holding a VPN connection to a private network
  • Users granted excessive network access privileges
  • DNS leak, where the system accessing a network uses its default DNS connection rather than the VPN’s secure DNS server 

Bury the VPN

It is time to say goodbye to the VPN, which is unable to meet the demands of remote access from either a performance and scalability perspective or a security perspective. VPNs were not designed to defend against current threats or support current use cases.

Simply put, the VPN has died, because it cannot provide the nuanced and granular access controls required. The ability to enforce the principle of least privilege as well as to identify and stop unusual user behavior is paramount to supporting the security strategies that enterprises must embrace.

Organizations are varied in their response to the death of the VPN. Some choose to gradually phase out VPNs as they reach the end of life. Others mothball them and adopt new technology that supports the new environments that enterprises have created.  

Regardless of how it is done, enterprises have no choice but to adopt security strategies and solutions that replace the VPN. What is required are processes and technology that reduce complexity and protect a growing attack surface targeted by increasingly sophisticated attacks from cybercriminals and malicious insiders.


Read the Full Article 

Why FTP Is Dead

May 3, 2023

FTP, like fax, is dead. However, many still have not received the message. Just like fax machines, FTP servers remain online at more companies than care to admit it. There are still an estimated 21 million FTP servers in use. Despite these numbers, FTP is, without a doubt, dead tech walking.

FTP’s Fatal Lack of Security 

FTP is inherently not a secure way to transfer data. It was not designed to be a secure protocol, which puts data stored and shared with FTP at extreme risk. Hackers know this and target FTP servers to gain access to sensitive files and folders, often using a simple packet tracer or standard protocol analyzer.

When a file is sent using the FTP protocol, the data, username, password, and commands are shared between the client and server in plain text, leaving them vulnerable to sniffer attacks. Because FTP alone does not provide encryption, hackers can easily intercept transferred data with little to no effort.
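You can see this plaintext exchange for yourself with Python's standard ftplib, which can echo each protocol command as it is sent. The host and credentials below are placeholders; the point is that USER and PASS cross the wire unencrypted, exactly as a sniffer would capture them.

```python
from ftplib import FTP

# set_debuglevel(1) prints every command sent to the server,
# including the username and password in cleartext.
# Host and credentials are placeholders.
ftp = FTP()
ftp.set_debuglevel(1)
ftp.connect("ftp.example.com", 21)
ftp.login("alice", "s3cret")   # goes over the wire as: USER alice / PASS s3cret
ftp.quit()
```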

In addition, the FTP protocol uses an outdated user-password scheme for authenticating users to the server. This weak authentication method puts data at risk if credentials are compromised, because unauthorized users can easily access the FTP account.

FTP servers are also subject to common hacker tactics, such as brute force attacks and spoofing attacks. In a brute force attack, the hacker breaks into the FTP server by trial and error, systematically guessing login credentials and encryption keys: a tool automatically generates combinations of usernames and passwords, which the attacker submits until valid credentials are found. In a spoofing attack, the hacker poses as a legitimate user or device on the network. A common spoofing tactic is the man-in-the-middle attack, where a hacker poses as the network and intercepts data transferred with FTP.

Additional Causes of FTP’s Demise

Cost

FTP requires organizations to purchase and maintain a dedicated file server. In addition, professionals must be hired or retained to set up and administer the system, wrangle its clunky access controls, and support teams using multiple operating systems. This is reported to take up anywhere from 10-20 hours of sysadmins’ time each week, much of it driven by helpdesk tickets.

Difficult to monitor activity

FTP lacks the auditing and activity alerts as well as the granular user roles and file permissions needed to control access effectively. With FTP, tracking what has been uploaded on a remote system or enforcing file-sharing rules and processes is difficult. If files are mishandled, or a data breach occurs, finding the source of the issue becomes a problem.  

Inefficient  

The FTP protocol is slow compared to other modern file transfer protocols, making it less than ideal for quickly sending files over the internet. The system hogs bandwidth when files are uploaded and served, especially when they are large files, such as videos. In addition, FTP server lags occur when too many users upload files at the same time. In this case, users can be blocked from accessing their files.

Lack of support for compliance requirements

Compliance should be a cause of concern when using FTP to send files. This is because FTP’s inadequate security can put organizations at risk of noncompliance fines or worse. If compliance with regulations, such as HIPAA, ITAR, PCI-DSS, SOX, or GLBA, is a requirement, FTP should not be used for file transfers. The security issues noted above make clear the security limitations of FTP for organizations subject to government and industry regulations. 

Poor collaboration 

Using FTP for collaboration is tedious and time-consuming at best. First, users must be set up with usernames, passwords, and settings for the FTP server. Then, access controls need to be put in place on specific folders. Even after this, the user experience is poor. Because FTP was not designed for multiple people uploading and downloading files at the same time, connections often time out with partial transfers, requiring users to delete files and resend them. In addition, keeping track of files and when they were modified is a headache: either users need to be messaged to let them know that something changed, or they have to compare file sizes and modified dates to determine whether there is a new version of a file. And then there is everyone’s favorite error message, “This file is already in use.”

Unreliable synchronization

FTP does not provide built-in synchronization between the server and local directories. Instead, keeping the two in step is a manual process of uploading files to the server by hand.
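A minimal sketch of what that manual process tends to look like, again using Python's ftplib with placeholder host, credentials, and paths: the script simply re-uploads everything, because FTP itself has no change tracking.

```python
import os
from ftplib import FTP

# Hand-rolled one-way "sync": walk a local folder and upload each file.
# FTP offers no change detection, so changed and unchanged files are
# re-uploaded alike. Host, credentials, and paths are placeholders.
LOCAL_DIR = "reports"

ftp = FTP("ftp.example.com")
ftp.login("alice", "s3cret")
for name in os.listdir(LOCAL_DIR):
    path = os.path.join(LOCAL_DIR, name)
    if os.path.isfile(path):
        with open(path, "rb") as f:
            ftp.storbinary(f"STOR {name}", f)
ftp.quit()
```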

Rest in Peace FTP

Created in 1971 by Abhay Bhushan, a master’s student at MIT, FTP had a good long run. But, into its 50s, FTP needs to be laid to rest. There are generations of file transfer solutions that are full of life and vigor. And, these solutions were purpose-built to take full advantage of modern tech to meet the requirements of today’s organizations.

Read the Full Article 

Why NAS Is Old School

May 3, 2023

In the mid-1990s, NAS devices were increasingly used as the system of choice to share files. This is because NAS was relatively easy to use and removed the burden and load of file serving from other servers on the network.

Users liked NAS, because it simplified sharing files within the organization. Likewise, IT admins were fans, because a single NAS replaced multiple file servers that created data management headaches.

The limitations of NAS came into high relief when users required access and sharing capabilities across multiple devices from remote locations. NAS servers just were not up to the task. And, the VPN access that was put in place to provide secure connections was slow and unreliable.  

Because of this, old-school NAS appliances are being either mothballed and replaced with cloud storage solutions or relegated to the hinterlands and used for cold storage. This is due to their inherent limitations, especially when compared to cloud storage.

Drawbacks of NAS

Despite its capabilities, NAS has significant drawbacks that have led to its demise as a file-sharing solution. Among the limitations of NAS are the following.

  • Creates unacceptable application lag and poor user experiences due to inadequate performance when network traffic spikes.
  • Does not have service guarantees to address user issues, such as problems with data lag, missing data, and lost data.
  • Drags down network performance with increased LAN traffic.
  • Has rigid storage capacity that requires predicting utilization and capacity requirements; when the predictions are wrong, the result is too much or too little capacity, leading to either performance issues or wasted capital expenditure.
  • Increases operational burden and drains IT budgets as admins are required to support and maintain complex NAS systems.
  • Lacks adequate security, as NAS subsystems do not include the access controls and additional security capabilities, such as native data encryption, needed to meet business and compliance requirements. IT teams must add these to NAS deployments, because failing to bolster NAS systems leaves organizations’ data at rest and data in motion at risk.
  • Plagues IT with scalability limitations and complexity.  
  • Puts data at risk in the case of a natural disaster, theft, or fire at the location where on-premises NAS is used.
  • Relies on hard disk drives (HDDs) to serve data, which creates I/O contention when too many users hit the system with simultaneous requests.
  • Results in packets being delayed or sent out of order, making a file unavailable until all packets arrive and are put back in order—due to the limitations of Ethernet transfers. 
  • Suffers from latency in demanding environments with large file transfers, such as video production or sharing multiple large files.

Why Data Is Being Moved from NAS to Cloud Storage En Masse

Data that was managed with NAS appliances has moved, or is moving, to cloud storage at a dizzying pace. The reasons for the success of cloud storage are many and varied, including the following.

  • Built-in backup options make it easy to provide data protection and resiliency.
  • Cloud service providers have the ability to procure and support best-of-breed security solutions.
  • Easy setup of a cloud file server—for simple deployments, a few clicks and a credit card are all it takes.
  • High-availability options are accessible to ensure that users can access files quickly whenever they need without worries about file size.
  • No need for IT support to create and maintain a cloud file server. 
  • Usage-based pricing and capacity provide elasticity to right-size storage based on actual demand. 

There are many more benefits of cloud storage based on different use cases and types of organizations using it. Regardless of how big or small the deployment, cloud storage offers far more than old-school NAS ever can.

Read the Full Article 

Modern, Secure Data Access with Egnyte and Salesforce

March 7, 2023

Every sales and marketing interaction — regardless of where it happens — generates data. Every note written on a salesperson’s computer and every contract or presentation that is uploaded into a CRM system produces valuable signals sales teams use to secure leads and close deals.

Managing and gaining access to all that data at the critical moment when it’s needed is not always easy - especially with large volumes of both unstructured and structured data comprising the full history of a particular account or prospect.

This blog explores some of the challenges associated with managing and accessing large data volumes, implications for revenue organizations, and how Egnyte’s integration with Salesforce is helping companies achieve more cost-effective, modern and secure access to critical account documents.

For many mid-sized organizations, it’s not uncommon to have millions of rows of data and documents inside their Salesforce deployment. Storage consumption can quickly exceed limits, forcing organizations to choose between deleting records to free up space or purchasing additional storage.

Additional storage in Salesforce can cost as much as $125 per month for 500MB. For users that require access to this data - even those outside the sales organization who may only need to view documents - Salesforce requires a dedicated license, adding another $25 per user, per month.
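A back-of-the-envelope calculation, using only the list prices quoted above (actual contract pricing varies), shows how quickly the add-on storage charges compound:

```python
# Illustrative arithmetic based on the figures quoted in this post;
# real pricing depends on the contract and billing period.
salesforce_per_500mb = 125                     # USD per month for 500 MB add-on
salesforce_per_gb = salesforce_per_500mb * 2   # 250 USD per GB per month
salesforce_per_tb = salesforce_per_gb * 1024   # 256,000 USD per TB per month

print(f"Salesforce add-on storage: ~${salesforce_per_tb:,} per TB-month")
```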

In addition to these cost implications, Salesforce’s file management system leaves a lot to be desired. Its search tools and version control features were not built with traditional file systems in mind, making it increasingly difficult for users to find what they need when they need it.  

How Does Egnyte Help?

Egnyte has partnered to build a secure API integration between the Salesforce cloud and the Egnyte cloud, so that account data is securely shared between the two. This means that Salesforce users can directly access files stored in Egnyte from within the Salesforce application, and the same folders and files are also accessible directly through Egnyte’s UI (see the API sketch after the list below). By storing those files in Egnyte, businesses gain immediate benefits:

  • Document storage cost savings. While there is a small integration charge, you’ll save money every month on storage. Pricing models differ between Salesforce and Egnyte, but it’s clear that the storage savings (Egnyte charges $350 for 1TB at the time of this post) are significant, especially for companies with large volumes of data.
  • Wider access footprint. All employees with access to the Egnyte repository also have access to the files, not just those with a Salesforce license. This means that more expensive Salesforce licenses can be reserved for your sales team, while supporting employees from finance, legal, service, and marketing departments can use the lower-cost Egnyte option.
  • Advanced collaboration and sharing. Your sales team can more easily find and collaborate on contracts, share documents securely, and provide links for customers to upload documents. Users get the convenience of accessing files in a familiar way from the desktop as well as mobile devices for teams in the field.
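For a sense of what "storing those files in Egnyte" looks like programmatically, here is a minimal, hypothetical sketch using Egnyte's public REST API to upload a contract into an account folder. The domain, token, and path are placeholders, and this illustrates a direct API upload rather than the packaged Salesforce integration itself.

```python
import requests

# Upload a file through Egnyte's Public API file-content endpoint.
# DOMAIN, TOKEN, and REMOTE_PATH are hypothetical placeholders.
DOMAIN = "acme.egnyte.com"
TOKEN = "YOUR_OAUTH_TOKEN"
REMOTE_PATH = "Shared/Sales/Accounts/Acme Corp/contract.pdf"

with open("contract.pdf", "rb") as f:
    resp = requests.post(
        f"https://{DOMAIN}/pubapi/v1/fs-content/{REMOTE_PATH}",
        headers={"Authorization": f"Bearer {TOKEN}"},
        data=f,
    )
resp.raise_for_status()
print("Uploaded:", REMOTE_PATH)
```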

As a secure file repository, there are additional benefits of storing files in Egnyte: 

  1. Visibility and control over sensitive information 
  2. Automated retention, archival and deletion policies 
  3. Access controls
  4. Workflows for review and approval of documents
  5. Unusual behavior and malware detection 

How It Works

Once the integration has been completed between your Salesforce account and your Egnyte account, your sales team will be able to access files from within Salesforce without leaving the application. An Egnyte window appears at the bottom of a typical customer screen in the Salesforce interface.

When setting up a new customer, you can use templates to establish standard folder structures and populate them with template files for your sales team to use. Sales teams can access and store files for a customer with little to no training. More importantly, the integration allows your sales teams to securely share files with customers using Egnyte file sharing controls, like link expiration, watermarking, notification when opened, specific email recipients, and encryption. This is a valuable capability when sharing sensitive pricing quotes externally.

Employees who don’t need access to Salesforce can still reach the files by going directly to Egnyte. This allows users in other departments of the company to access files to support the sales process. For instance, finance and legal teams can collaborate on quotes, while service and support can review terms and delivery.

The Egnyte and Salesforce integration helps to streamline quote-to-cash workflows, and at the same time, saves money in storage and licensing costs for supporting personnel. With advanced Egnyte search capabilities, documents are at your teams’ fingertips, allowing them to respond more quickly and effectively, increasing customer satisfaction and accelerating revenue capture.

For more information, see our Salesforce Helpdesk Integration article or contact your Egnyte representative.

Read the Full Article 

Setting Up a New eTMF Study in Egnyte

February 27, 2023

Egnyte’s eTMF allows you to assemble all the critical documentation related to your clinical trial so that you stay on track and audit-ready.  Running your own eTMF gives you full visibility and control over data that is critical to the success of your company.

In this article, you’ll learn how to set up a new study in Egnyte’s eTMF app.

Why Use an eTMF App?

A TMF is really just a collection of documents - office documents, PDFs, emails, and the myriad documents that cover the conduct of your trial.  So why not just store these on a file server? 

Because using a purpose-built eTMF application allows you to stay on track and aligned to industry standards and timelines.  You’ll be able to guide your team to collect the right documents at the right time and you’ll have the tools at your disposal to safely manage those documents, providing secure access to your team, your CRO(s), and auditors.

Setting Up a New Study in Egnyte

Take the following steps to set up a study in the Egnyte eTMF app.

Step 1: Create the Study

The first step is to create a new study. Egnyte's eTMF app can support multiple studies, each with independent access permissions.  When creating a new study you’ll need to provide a few details.  Some of these are informational only, while others control how the eTMF is initially configured.

  • The Study Name and Study ID will help you to identify the study in the app
  • The Study Type and Investigator-Initiated Study options determine which TMF artifacts (document types in the TMF Reference Model) are selected by default for your study. The reference model has rules about which artifacts are “core” for different study types, and Egnyte builds that logic into the app to get you off to a good start.
  • The Who can manage this level? option lets you specify an Egnyte user group who can be assigned as Study managers.  

Step 2: Configure Required Artifacts

While the app preconfigures required artifacts based on the TMF Reference Model, you may want to modify the configuration to include additional artifacts, remove defaults, or even add custom artifact types.

Open the Artifact Configuration tab in the Study Configuration to choose which artifacts are required at the trial, country, and site levels.  The app shows the reference model recommendations and allows you to filter and search the hundreds of artifacts to make your selections.

You can always come back to Artifact Configuration later if you change your mind on which artifacts are required for your study.

Step 3: Add Countries

The next step is to add the countries in which you will conduct your trial.  You do this by first selecting the Trial in the Configuration screen and opening the Countries tab.

Use the Add a new country button to bring up the new country form and fill it out.  Choose the country from the list and optionally fill out contact information.  Repeat this for each of the countries.

You’ll notice that the study hierarchy is updated with the country codes for your selections.  You can select a country in the hierarchy to update its settings and change permissions.

Step 4: Add Sites

Once your countries are set up, you can add your sites.  The process is very similar to adding countries.  This time you’ll select the country associated with the site in the study hierarchy and navigate to the Sites tab.

Use the Add a new site button to bring up the new site form and fill it out.  Give a name and ID to the site and optionally enter contact information for the site.  Repeat this for each of the sites in your study, making sure you enter the sites in the right country.

The study hierarchy will show each added site nested under its associated country.  You can select the site in the hierarchy to update its settings and change permissions.

Step 5: Activate Your Study

The study, trial, countries, and sites are initially created in an inactive state.  When one of these filing levels is inactive, it isn’t available to your end users and they won’t be able to upload documents or view status.  This gives you an opportunity to set everything up fully before providing access to your users.

To activate the study or any of the filing levels in the study hierarchy, choose that item in the hierarchy and use the dropdown next to its name to change the status to active.  The system will prompt you to provide a 21 CFR Part 11 compliant digital signature that will be tracked in the eTMF audit report. 
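Conceptually, the filing-level hierarchy and its activation rule can be pictured with the short sketch below. This is an illustration of the ideas described here, not Egnyte's implementation; all names and fields are hypothetical.

```python
from dataclasses import dataclass, field

# Each filing level (study, country, site) starts inactive; activation
# requires a signature, which the real app records in the audit report.
@dataclass
class FilingLevel:
    name: str
    status: str = "inactive"
    children: list["FilingLevel"] = field(default_factory=list)

    def activate(self, signature: str) -> None:
        if not signature:
            raise ValueError("activation requires a digital signature")
        self.status = "active"

study = FilingLevel("STUDY-001")
usa = FilingLevel("US")
usa.children.append(FilingLevel("Site 101"))
study.children.append(usa)

study.activate(signature="jane.doe|2023-02-27T10:00:00Z")
print(study.status)  # active
```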

Step 6: Activate Filing Level Milestones

The Egnyte eTMF app uses the TMF Reference Model Milestones to track the progress of each filing level and determine which artifacts are needed at a given point in time.  Before you can upload files, you need to activate these milestones.

Choose Milestone Management in the left navigation and then select the relevant filing level.  From here, you can choose a milestone and change its status from Inactive to Active.  As with activating a filing level, you’ll need to provide a digital signature to proceed.

Start Using your eTMF

With just a few clicks, your study hierarchy is set up, artifact configurations are defined, and milestones are activated; you’re ready to start uploading documents to the eTMF app.  Your users will see the lists of required documents for each filing level.  They can upload files while you track progress, search and view uploaded documents, and review the audit history of all changes to ensure that your trial stays on track and inspection ready.

Read the Full Article 

How Barnhill Reduces Document Duplication and Miscommunication by Pairing Egnyte and Procore

January 30, 2023

Tough IT Job #271

Barnhill Contracting Company is spearheading the renovation of the Kinston Regional Airport in Kinston, North Carolina, funded by an impressive $8 million grant. Their site development professionals are undertaking repaving and site renovations to ensure a smooth take-off experience for local travelers. To do that effectively, they turned to Egnyte to ensure that all stakeholders can securely access project documents easily from anywhere, in any tool, and at any time.

The Challenge

Barnhill’s primary challenge was that the majority of their project documents were dispersed between two main systems - Procore and Egnyte.  This created the potential for duplicate documents and laid the groundwork for miscommunication, as internal office teams and external project stakeholders were not always on the same page about field team activities.  In addition, Barnhill relied on a separate tool for punch lists, drawings, and submittals, which further complicated matters by introducing an additional document repository with no connectivity between the systems.

The Solution

Barnhill turned to Egnyte’s new file sync connector with Procore to solve the problem.  Now, project teams benefit from the convenience of a single source of truth in Egnyte.  By leveraging a standard Egnyte folder structure, team members no longer need to toggle between systems to access the information they need. Field and office teams have effortless access to documents through real-time file synchronization, regardless of whether they are onsite or off.  So whether an executive needs content for review or an onsite manager is collaborating directly within the Procore platform, all users have simple and fast access to what matters most, without missing a beat.

Today, Barnhill Contracting trusts the Egnyte and Procore integration to keep its data synced and available for the Kinston Regional Airport renovation project team. This streamlined process reduces the amount of desktop work required from building division members - all thanks to an intuitive user experience provided by Egnyte. With this solution, the team can access relevant files through either Egnyte or Procore and get straight back on task with minimal disruption.

Read Barnhill's full story here.

Read the Full Article 

How ESA Removes Barriers to Data Access in the Field with Egnyte

January 30, 2023

Tough IT Job #21

ESA’s expert team of architectural historians, historic architects, conservators, and preservation planners are committed to upholding the highest authority in preservation planning.  They assist firms looking to adapt or redevelop a historic site by ensuring that projects adhere to federal, state and municipal regulations in place to protect these important pieces of our collective shared history.

ESA relies on Egnyte to curate and transfer critical field data – information essential for tackling issues such as watershed management, community development, and resiliency planning.

Challenge

ESA's staff struggled to share data collected in the field.  Their standard process was manual and time-consuming: field staff would transport data back to the office and upload it into a legacy file system in order to make it available to other team members. To make matters worse, more than 100 employees worked on large Computer Aided Design (CAD) files, which, due to their size, presented their own set of access challenges. The overall process raised security, continuity, data capture, and integrity concerns. To combat these issues, ESA needed a modern solution that put secure, uninterrupted access to critical site information directly into end users' hands, from any location.

Solution

ESA deployed Egnyte and eliminated any worry of data being lost, compromised, or simply difficult to access. They now have a single source for production information that is accessible anytime, whether personnel are in-office or remote, resulting in less stress on teams across all engineering projects. Tapping into Egnyte’s automated archiving capability, ESA can also store contract and warranty files securely while allowing for quick retrieval when needed, so operations continue seamlessly without disruption.

With Egnyte’s simple yet secure data access and management solution, ESA is able to remain devotedly focused on preserving invaluable historical artifacts for generations to come.

Read ESA's full story here.

Read the Full Article 

How KAST Construction Optimizes File Access and Sharing During Preconstruction

January 30, 2023

Tough IT Job #128

KAST Construction is building the Reflection Tower, an 18-story condominium tower that also includes 2,800 square feet of ground-floor commercial space, in downtown St. Petersburg, Florida. With any project of this size and complexity comes the need for careful planning and proactive decision-making during the preconstruction phase – from logistical considerations to keeping stakeholders informed and connected to the latest designs, budget, and schedule. Behind the scenes, KAST Construction leverages Egnyte to provide operational and VDC teams quick and easy access to the latest information and files.

Challenge

KAST’s VDC team operated almost independently of the projects they supported. The team began each project by asking the project manager a series of questions, then worked amongst themselves to produce the models.

If the VDC team scheduled time to work on a project but lacked all the necessary information to perform their work, they would email the project manager to get answers. Waiting for responses resulted in project delays.

To make matters worse, operational teams executing the projects—the ones who needed the updated model files from the VDC team—couldn’t find the information.

Solution

KAST used Egnyte’s AEC file sharing solution to standardize the project file structure, enabling VDC and Operations teams to work together more cohesively. With Egnyte, the VDC teams know exactly where to find the information they need without having to ask the project manager or wait for an email response. The Operations teams can go to the pre-construction folder to find the latest model files they need.

The standardized folder structure and file update notifications make it easy to find the right information at the right time on any project. As teams transition between projects, they know what information will be in each folder and how to get it, saving countless hours.

KAST is building smarter and faster with a little help from Egnyte.

Read KAST's full story here.

Read the Full Article 

How CW Driver Uses Egnyte to Centralize Project Documents and Sync with Autodesk Construction Cloud

January 30, 2023

Tough IT Job #109

Construction projects often involve a multitude of stakeholders and require careful attention to detail. The San Jacinto Campus STEM Science and Technology Building is no exception. To keep all parties informed on progress updates, C.W. Driver utilizes Egnyte’s robust cloud platform to centralize project files and automatically sync Autodesk Construction Cloud documents, ensuring each stakeholder stays apprised of construction activities at all times.

The Challenge

Over time, C.W. Driver took an organic approach to building its IT infrastructure — adding storage capacity and file-sharing tools as needed. Multiple technologies were adopted to support file sharing and management within the business.  This made it very difficult to ensure that timely data backups were happening, while sprawling data across cloud and on-premises sources created workflow problems for project teams. 

The Solution

To optimize file access and minimize data sprawl on the San Jacinto Campus STEM Science and Technology Building project, C.W. Driver turned to Egnyte’s cloud file solution. Its centralized repository and user-friendly interface for file collaboration provided a familiar mapped-drive format that employees had become comfortable using.

The Autodesk Construction Cloud integration with Egnyte syncs project data and extends file access and visibility even further, providing the ability to share project data interchangeably across multiple devices and quickly sync changes for up-to-date project information anytime, anywhere. Project teams can now provide consistent access to the latest files for all stakeholders.

Today, the San Jacinto Campus STEM Science and Technology Building project team taps into a single source of truth for the latest drawings and RFI documents in the project’s official Egnyte master folder.  Internal teams in the office and the field, along with project stakeholders, can work confidently from the latest information and accelerate the project’s progress.

With solutions like Egnyte, C.W. Driver is building an IT environment that will keep them running efficiently and productively for the next 100 years.

Read C.W. Driver's full story here.

Read the Full Article 

Top Considerations for Building a Lab-to-Cloud Workflow

October 28, 2022

Since March 2020, cloud adoption has accelerated at an unprecedented rate and across every industry. With the pandemic ushering in the work-from-home era, the ability of organizations to collaborate remotely has become paramount, placing a higher-than-ever premium on cloud technology. 

This shift to the cloud has taken different forms for different industries. For life science organizations, one example is the enormous volumes of clinical data generated in lab settings that need to be shared globally to cloud-based repositories, and made accessible to myriad remote stakeholders. Any organization hoping to extract immediate value from this data is tasked with building a lab-to-cloud data workflow that is scalable, has integrity, and allows for granular control and oversight of valuable clinical data.

Here are some critical considerations for organizations building lab-to-cloud workflows.

Have You Established the Scope of Your Data Validation?

Clinical research today involves mountains of data from a wide variety of sources. Much of that data must be validated when uploaded to the cloud. But some of it does not need to be – and it’s up to each organization to determine what data to include in its validation efforts.

On the one hand, if an organization fails to validate enough of its data, they risk compliance violations and compromised data integrity. On the other hand, validating data that doesn’t need it constitutes a highly inefficient use of resources, which can slow progress and extend trial timelines. 

How will you set parameters around what data does and doesn’t need validation? How will your lab-to-cloud workflow provide the visibility to separate the two categories easily? These are critical questions for organizations to answer as they move from paper-based or on-premises electronic data management to the cloud.
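One illustrative way to make those parameters concrete is a simple rules table that routes each data category to a validated or non-validated zone on upload. The categories and destinations below are hypothetical; a real policy would be far more granular.

```python
# Hypothetical mapping of data categories to validation requirements.
VALIDATION_REQUIRED = {
    "clinical_raw": True,      # subject-level trial data
    "instrument_qc": True,     # QC runs supporting a submission
    "exploratory": False,      # early research data
    "lab_notes": False,
}

def route(category: str) -> str:
    # Unknown categories default to the validated zone (the safe side).
    if VALIDATION_REQUIRED.get(category, True):
        return "validated-zone/"
    return "general-zone/"

print(route("clinical_raw"))   # validated-zone/
print(route("exploratory"))    # general-zone/
```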

Are All of Your Teams on the Same Page?

Building an effective lab-to-cloud workflow is a multi-team process. But distrust among teams at life science organizations is a pervasive issue that can seriously undermine progress toward smooth adoption. 

As just one example: Many life science organizations have seen tensions build between their IT and QA teams when creating their lab-to-cloud workflows. IT accuses QA of not fully appreciating cloud technology's complexities; QA responds by accusing IT of failing to grasp the breadth and nuance of compliance considerations. This tension produces negative downstream outcomes for the entire organization, from regulatory risks and extended trial timelines to poor data integrity.

Before making concrete decisions about how to approach cloud adoption – what vendors to partner with, what workflows to implement – ask yourself: Are my IT and QA teams on the same page and working toward a common goal? If not, what do I need to do to change that?

Are You On Top of Data Control Issues?

Central to data integrity in the cloud computing era are two fundamental questions:

  1. Who can access and control the content you’re storing in the cloud?
  2. Where are you storing your data, and how does that change based on factors such as its type or access permissions?

These issues are often overlooked but will only become more pressing as cloud adoption progresses across the life sciences. To address them, organizations must ensure that their lab-to-cloud workflows include concrete standards around data access permissions and storage practices. At stake is their ability to uphold data integrity and compliance when collaborating remotely.

Building Your Lab-to-Cloud Workflow

This article has posed several questions organizations must consider when building their lab-to-cloud workflows. Nathan McBride, former Vice President of IT at Affinivax, holds many of the keys to answering those questions. 

During Egnyte’s recent Life Sciences Summit, Nathan shared a scalable, detailed, multi-level model for establishing data validation protocols, zeroing in on data that needs validation, and ensuring data integrity – all while strengthening data security – as you shift critical data workflows to the cloud. 

Watch the recorded session for an up-close look at Nathan’s powerful model – and for concrete, actionable guidance on how you can build your lab-to-cloud workflow.

Read the Full Article 

We’re Building a New CRM Integration with Microsoft Dynamics 365 - Beta Coming Soon

January 10, 2020

Customers are the cornerstone of any company, which is why customer relationship management (CRM) tools have become essential for today’s digital enterprise. With this in mind, Egnyte is building out integration with Microsoft Dynamics 365 as part of our ongoing efforts to extend our CRM integrations.

Egnyte’s Dynamics 365 integration will soon be available in beta mode and is part of our ongoing efforts to align with Microsoft tools. We currently offer integrations with Microsoft’s Office 365, Teams, Power Automate, and SharePoint.

Egnyte for Dynamics 365 provides a central location to manage and organize all CRM-related documents and records. Egnyte users can upload, access, and share CRM files and folders directly from their Egnyte web interface, which is embedded within Dynamics 365 accounts. This eliminates the challenges of having multiple documents in different places and allows for easy collaboration for those inside and outside the organization.

This integration not only allows for easy content management, but also provides enhanced security and increased storage capacity. It also helps companies save money: one of the reasons Egnyte built its Salesforce CRM integration years ago was that Salesforce storage costs were often redundant given Egnyte’s storage offerings.

With Egnyte and Dynamics 365, organizations can:

  • Give sales teams anytime, anywhere access to the files they need through the Egnyte Web interface, which is embedded within the Dynamics 365 environment.
  • Enhance collaboration by sharing Egnyte files and folders from Dynamics 365, even with those who don’t have a Dynamics 365 license.
  • Easily organize client files with Egnyte folders created for each Dynamics 365 lead, opportunity, and customer record.
  • And more

Our Dynamics 365 integration is yet another step toward helping our customers build a powerful, collaborative, and secure platform, designed specifically for their business.


To learn more about the Egnyte and Microsoft partnership, please click here.  

Check back to see when you can try out the Microsoft Dynamics 365 integration.

Read the Full Article 
