Big Data Workload and Data Migration With Enterprise Data Backup | MLens




Compute Workload Migration, Cloud Data Management and more.
MLens is a one-stop solution for all your Big Data needs, from Automated Disaster Recovery to Compute Workload Migration, and everything in between.




Workload Migration to the Serverless Data Lake or Enterprise Data Lake 3.0


Automate your big data workload migration from either cloud or on-prem to Databricks on Azure or AWS.

Enable transparent accessibility of migrated assets so developers can review and make necessary amendments.

Configurable, with an assessment report generated at each stage of the migration

Automated migration of big data applications, including complex Hive queries and Oozie workflows, to the Databricks environment on AWS or Azure
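To illustrate the kind of rewrite an automated Hive-to-Databricks migration performs, here is a minimal, hypothetical sketch: MLens's actual translation rules are not public, so the rule table below (stripping Hive-only engine settings, mapping a storage clause) is purely illustrative.

```python
import re

# Hypothetical rule table illustrating automated Hive-to-Spark SQL translation.
# These two rules are illustrative only; a real migration tool applies many more.
HIVE_TO_SPARK_RULES = [
    # Map a Hive storage clause to Spark SQL's USING syntax.
    (re.compile(r"STORED AS TEXTFILE", re.IGNORECASE), "USING TEXT"),
    # Hive engine settings have no meaning on Databricks; flag them for review.
    (re.compile(r"^SET\s+hive\..*$", re.IGNORECASE | re.MULTILINE),
     "-- removed Hive engine setting"),
]

def translate(hive_sql: str) -> str:
    """Apply each rewrite rule in order and return the translated statement."""
    spark_sql = hive_sql
    for pattern, replacement in HIVE_TO_SPARK_RULES:
        spark_sql = pattern.sub(replacement, spark_sql)
    return spark_sql
```

Developers can then review the translated output and the removed-setting comments, matching the "transparent accessibility of migrated assets" workflow described above.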


Big Data Backup


Identifies changes to source tables/directories in real time

Migrates incremental changes to your client endpoints

Supports compression and distributed data transfer to/from Cloud Stores

Provides detailed auditing and logging

Endpoint connectors available for Hadoop, HBase, RDBMS, Teradata and more

Supports built-in compression and file format conversion

Restarts from the last sync point in case of connectivity failure
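The "restart from the last sync point" behaviour can be sketched as a checkpointed incremental copy. This is a hypothetical, filesystem-level illustration; MLens's real change-detection and connector internals are not public.

```python
import json
import shutil
from pathlib import Path

def incremental_sync(source: Path, target: Path, checkpoint_file: Path) -> list:
    """Copy files modified since the last recorded sync time, then advance the checkpoint.

    Illustrative sketch only: a real backup tool would track per-file state,
    checksums, and remote endpoints rather than a single mtime watermark.
    """
    last_sync = 0.0
    if checkpoint_file.exists():
        last_sync = json.loads(checkpoint_file.read_text())["last_sync"]
    copied, newest = [], last_sync
    for src in sorted(source.rglob("*")):
        if src.is_file() and src.stat().st_mtime > last_sync:
            dst = target / src.relative_to(source)
            dst.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(src, dst)
            copied.append(str(src.relative_to(source)))
            newest = max(newest, src.stat().st_mtime)
    # Persist the checkpoint only after the copies succeed, so an interrupted
    # run simply re-transfers from the previous sync point on restart.
    checkpoint_file.write_text(json.dumps({"last_sync": newest}))
    return copied
```

Because the checkpoint is written after the transfer completes, a connectivity failure mid-run leaves the old checkpoint intact and the next run resumes from it.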


Automated Disaster Recovery


Automatically restores the full cluster, including infrastructure provisioning, configurations, network topology, Kerberos security, etc., during Disaster Recovery

Full support for installation and restore of Sentry, Hive, Oozie, Airflow, Impala, ACLs, HDFS Snapshots, etc. during Disaster Recovery

Full backup and restore of the Hive Metastore and any custom RDBMS schemas and databases

Supports full and incremental backup of cluster configurations, topology, and roles of the existing cluster

Supports Disaster Recovery on on-premises or cloud infrastructure

Seamless Disaster Recovery on Kerberos-enabled clusters
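A full-cluster restore of this kind is inherently ordered: infrastructure first, then configuration and security, then services and data. The sketch below illustrates that ordering with a simple fail-fast plan runner; the step names are hypothetical stand-ins, since the real orchestration is internal to MLens.

```python
# Illustrative restore ordering for automated Disaster Recovery.
# Step names are hypothetical, not MLens configuration keys.
RESTORE_PLAN = [
    "provision_infrastructure",
    "apply_cluster_configuration",
    "restore_kerberos_security",
    "restore_hive_metastore",
    "restore_hdfs_snapshots",
]

def run_restore(plan, executors):
    """Execute restore steps in order, stopping at the first failure.

    `executors` maps each step name to a callable returning True on success.
    """
    completed = []
    for step in plan:
        if not executors[step]():
            break  # fail fast: later steps depend on earlier ones
        completed.append(step)
    return completed
```

Stopping at the first failed step matters because, for example, restoring HDFS snapshots onto a cluster whose Kerberos security was not restored would leave data unprotected.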

Data Encryption, Compression & Archival


Compresses and backs up data across firewalled platforms and clusters

Configurable compression formats

Archive compressed data in Cloud storage

Target format can be Parquet, Avro, or text

High Speed Transfer

Utility to merge small files

Archive to S3, Azure Blob, Glacier, etc.
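The compress-then-archive round trip can be sketched with stdlib gzip. This is a minimal local illustration: the `archive_dir` stands in for a cloud store such as S3 or Azure Blob, and a configurable-format tool would swap gzip for other codecs.

```python
import gzip
from pathlib import Path

def compress_for_archive(data_file: Path, archive_dir: Path) -> Path:
    """Gzip a file into the archive directory, keeping the original name."""
    archive_dir.mkdir(parents=True, exist_ok=True)
    archived = archive_dir / (data_file.name + ".gz")
    archived.write_bytes(gzip.compress(data_file.read_bytes()))
    return archived

def restore_from_archive(archived: Path, restore_dir: Path) -> Path:
    """Decompress an archived file back to its original content."""
    restore_dir.mkdir(parents=True, exist_ok=True)
    restored = restore_dir / archived.name.removesuffix(".gz")
    restored.write_bytes(gzip.decompress(archived.read_bytes()))
    return restored
```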


High Speed Batch Data Ingestion


Performs high speed parallel data processing

Supports FTP/SFTP, cloud storage such as S3 and Azure Blob, as well as standard RDBMS sources

Supports compression and file format conversion during ingestion

Supports configuration driven transformation during ingestion

Merges small files on the fly

Supports parallel data ingestion without landing zones
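The parallel, landing-zone-free ingestion pattern can be sketched as a per-source pipeline where each worker reads, transforms, and writes directly to the target with no intermediate staging area. The `fetch`, `transform`, and `store` callables below are hypothetical stand-ins for real FTP/SFTP, S3, or RDBMS connectors.

```python
from concurrent.futures import ThreadPoolExecutor

def ingest_parallel(sources, fetch, transform, store, workers=4):
    """Run fetch -> transform -> store for each source in parallel.

    Illustrative sketch: no landing zone is used, because each record flows
    straight from its source through the in-flight transformation to the target.
    """
    def pipeline(source):
        return store(source, transform(fetch(source)))

    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(pipeline, sources))
```

Configuration-driven transformation then amounts to choosing the `transform` callable per ingestion job instead of hard-coding it.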

Monitoring & Scheduling


View live progress of the Backup and Recovery Processing through the Job Monitoring Console

Manage the Scheduling and Configuration of the Backup and Restore Process

View progress across clusters in an Enterprise in a single window

View and debug any errors or failures

Real Time Notifications for any Job Failure

Notification Emails for Job completion and Job Status Reports
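The real-time failure-notification idea reduces to checking job statuses and alerting once per newly failed job. A minimal sketch, assuming a simple status map and a set of already-alerted jobs (the actual monitoring console's data model is not public):

```python
def failure_alerts(job_statuses, already_alerted):
    """Return notification messages for jobs newly seen in a FAILED state.

    `job_statuses` maps job name -> status string; `already_alerted` is a set
    mutated in place so repeated polls do not re-send the same alert.
    """
    alerts = []
    for job, status in job_statuses.items():
        if status == "FAILED" and job not in already_alerted:
            alerts.append(f"Job {job} failed: sending notification")
            already_alerted.add(job)
    return alerts
```

In a real deployment the returned messages would be handed to an email or messaging gateway rather than collected in a list.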


Secured Access Controls


Active Directory/LDAP/Kerberos based authentication for secured data access

Encryption Support for data in transit and at Rest

Selective Data backup and Restore based on security needs

Isolation and support for Encryption Zones

Secured authentication for all endpoints and sources using integrated authentication
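Selective backup based on security needs can be pictured as a classification filter applied before any data leaves the cluster. The classification tag names below are purely illustrative, not MLens configuration values.

```python
def select_for_backup(datasets, allowed=frozenset({"public", "internal"})):
    """Keep only datasets whose classification permits off-cluster backup.

    Illustrative sketch: each dataset is a dict with hypothetical
    "name" and "classification" keys.
    """
    return [d["name"] for d in datasets if d["classification"] in allowed]
```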



"With MLens, users will experience automation right from the assessment phase to the migration phase, reducing time and effort by over 70% in the entire Big Data Migration journey."

You're in good company.

