Cloud computing is the way of the future, and the way to bring your company to the next level. With the ability to use enterprise-grade services and technologies at a significantly lower price, your company can focus on creating more value while your IT department spends less time maintaining infrastructure.
These are our top five reasons to move your BI infrastructure to the cloud:
What is Amazon DMS?
Every day, more and more companies are moving towards cloud computing, with Amazon Web Services (AWS) undoubtedly being the biggest player. Having all the possible AWS services available at your fingertips is great, but you still need to migrate your existing infrastructure and data into the AWS cloud. At re:Invent 2015, Amazon announced the "AWS Database Migration Service" (DMS), aiming to make the process of moving data into databases on AWS a lot easier.
AWS DMS supports most open-source and commercial databases, such as PostgreSQL, MySQL, MariaDB, Oracle and Microsoft SQL Server, and of course Amazon's own Aurora, Redshift, DynamoDB and S3 services. Both homogeneous (e.g. PostgreSQL to PostgreSQL) and heterogeneous migrations (e.g. Oracle to MySQL) are supported. Either the source or the target database must be in the AWS cloud. DMS is regularly updated with new features and supported engines.
At the highest level, you have three components to take care of when starting a migration using DMS:
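Those three building blocks are a replication instance, a pair of source and target endpoints, and a migration task that ties them together. A minimal sketch of them as plain request payloads (all identifiers, hostnames and ARNs below are hypothetical placeholders; in practice each payload would be passed to the corresponding `aws dms create-...` CLI command or SDK call):

```python
import json

def replication_instance(identifier, instance_class="dms.t2.medium"):
    """The replication instance performs the actual data copying."""
    return {
        "ReplicationInstanceIdentifier": identifier,
        "ReplicationInstanceClass": instance_class,
        "AllocatedStorage": 50,  # GB of working storage for the migration
    }

def endpoint(identifier, endpoint_type, engine, server, port, database):
    """A source or target endpoint describing how to reach a database."""
    return {
        "EndpointIdentifier": identifier,
        "EndpointType": endpoint_type,   # "source" or "target"
        "EngineName": engine,            # e.g. "oracle", "mysql", "postgres"
        "ServerName": server,
        "Port": port,
        "DatabaseName": database,
    }

def migration_task(identifier, source_arn, target_arn, migration_type="full-load"):
    """The task connects a source and target endpoint on an instance."""
    return {
        "ReplicationTaskIdentifier": identifier,
        "SourceEndpointArn": source_arn,
        "TargetEndpointArn": target_arn,
        "MigrationType": migration_type,  # "full-load", "cdc" or "full-load-and-cdc"
        # Table mappings select which schemas/tables to migrate; "%" matches all.
        "TableMappings": json.dumps({
            "rules": [{
                "rule-type": "selection",
                "rule-id": "1",
                "rule-name": "include-all",
                "object-locator": {"schema-name": "%", "table-name": "%"},
                "rule-action": "include",
            }]
        }),
    }

# A heterogeneous Oracle-to-MySQL migration, as mentioned above:
source = endpoint("src-oracle", "source", "oracle", "db.example.com", 1521, "ORCL")
target = endpoint("tgt-mysql", "target", "mysql", "mysql.example.com", 3306, "sales")
```

The `MigrationType` setting is what distinguishes a one-off copy ("full-load") from ongoing change data capture ("cdc"), or a combination of both.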
My views on BI after one year in the trenches
After having worked for about a year as a Business Intelligence consultant, I’d like to explain my views on the subject. At know.bi, we mainly work with the commercial open source BI platform Pentaho, so I’ll use that as a reference, but this post should apply to BI in general and is not meant to be limited to any given platform.
Pentaho Report Designer - Multi-sheet Excel Reports
When you’re designing a report with Pentaho Report Designer (PRD), you may want it to contain multiple sheets like an Excel workbook does. For instance, you may want interactive sheets and the ability to export to Excel correctly. Sadly, this isn’t the easiest task, but here’s one way you can get it to work.
3 reasons to move your ETL to the web and cloud
ETL development traditionally relies heavily on the desktop, with file, database and network connections that require the developer to be inside the company network where those resources are located.
Apart from these access restrictions, most of the established ETL platforms have a history of over a decade and were originally developed in an era when web-based applications were basic at best.
Times have changed, however, and web applications have come a long way. We'll look at a number of reasons to move your ETL to the web and/or cloud.
1. Data can't leave the organization
There are plenty of cases where data is considered to be too sensitive to leave the organization's premises or (virtual) private cloud.
With a centralized ETL infrastructure, ETL developers and data engineers can work from anywhere in the world. All of the data is managed over secure connections without the need for a single byte of data to leave the organization's systems.
2. Data is too big to copy or changes frequently
ETL developers and data engineers often need to work in geographically separate locations, while the data remains in one location.
Developing ETL or working with frequently changing data over VPN connections and remote desktop protocols is painful, if possible at all.
Life can be a lot easier if the ETL and data management work can be done over a standard HTTP(S) protocol from anywhere in the world.
3. Simplified installation, configuration and project management
Last but not least, ETL configuration management and overall DevOps for a large number of desktop installations can be a burden.
Instead of maintaining an installation on every ETL developer's or data engineer's machine, a centralized approach can significantly simplify the process.
With a centralized installation, developers are guaranteed to work on the same standardized software version, configuration and set of plugins.
Additionally, ETL working practices and conventions are a lot easier to enforce from a centralized environment.
Try it out for yourself
If you're using or considering Pentaho (now part of Hitachi Vantara), all of this is within reach: with the WebSpoon project, your existing ETL can simply be moved to the web and cloud. No changes to your existing code base are required, and you can gradually (or partially) make the switch to web- or cloud-based ETL.
We've set up a demo environment for WebSpoon, feel free to give it a try.
WebSpoon is available as open source and is not (yet) part of the Pentaho Enterprise Edition. Let us know if you'd like to find out how we can help you bridge the gap.
Disclaimer: the use cases and images in this post were taken from WebSpoon author Hiromu Hota's presentation.
PCM17 - Business Use Cases Room
Read our overview of the Keynotes
Read our overview of the talks in the Technical room
Using a BI tool to improve the management of health data in Mozambique - Devan Manharlal
Devan kicked off the talks in the business room by sharing his experiences building a health data management platform in Mozambique.
Part of the project's scope was the geographical allocation of staff such as nurses across the country, which poses specific challenges in a developing country like Mozambique.
Pentaho was chosen mainly because of the need for
PCM17 - Technical Room
Read our overview of the Keynotes
Read our overview of the talks in the Business room
Data Pipelines - Running PDI on AWS Lambda - Dan Keeley
Dan explained how serverless PDI allows you to spend time on the solution rather than on getting the PDI server and infrastructure up and running.
Although virtualization already takes away some of the infrastructure management pain, there's still quite a bit of overhead involved, whereas no infrastructure management at all is needed when running serverless in the cloud.
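As an illustration of the idea (not Dan's actual setup), here is a minimal sketch of an AWS Lambda handler that would kick off a PDI job via Kitchen, PDI's command-line job runner. The paths, event fields and parameter names below are hypothetical assumptions:

```python
# Hypothetical Lambda handler sketch for launching a PDI job with Kitchen.
# The PDI install path, job path and event shape are assumptions.
def handler(event, context=None):
    """Build the Kitchen command line for the job named in the event."""
    job = event.get("job", "/opt/pdi/jobs/load_sales.kjb")  # hypothetical path
    params = event.get("params", {})
    cmd = ["/opt/pdi/kitchen.sh", f"-file={job}", "-level=Basic"]
    # PDI named parameters are passed as -param:NAME=value arguments.
    cmd += [f"-param:{k}={v}" for k, v in sorted(params.items())]
    # In a real function you'd execute it, e.g. subprocess.run(cmd, check=True),
    # and return the exit status; this sketch just returns the command.
    return {"command": cmd}
```

Each invocation builds the full command from the incoming event, so the same function can run any job with any parameters, and there is no long-running server to patch or monitor.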