When you build applications based on Mule EE (Enterprise Edition) and you use Maven to build your projects, you will notice you have dependencies on libraries that are not available in the public Maven repositories. To add these libraries to your local Maven repository, the Mule distribution comes with a script, ‘populate_m2_repo’; this post describes how to use it. Continue reading
A quite common use case in a Mule ESB flow is validating that an XML document is valid against its corresponding XSD. Especially when a project is in the development/test stage, it can be quite annoying to find out you have spent a lot of time fixing an issue that was actually caused by another system supplying invalid XML.
The ‘standard’ way to do this in Mule is with the schema-validation-filter, especially in combination with the message-filter. For example, the following configuration sends all invalid messages to the ‘process-invalid-xml’ flow:
<message-filter onUnaccepted="process-invalid-xml" throwOnUnaccepted="true">
    <xml:schema-validation-filter .../>
</message-filter>
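To sketch how this fits into a complete configuration, the filter could be embedded in a flow like the following; apart from the ‘process-invalid-xml’ name taken from the snippet above, the flow name, schema location, and logger message are illustrative:

```xml
<flow name="main-flow">
    <message-filter onUnaccepted="process-invalid-xml" throwOnUnaccepted="true">
        <xml:schema-validation-filter schemaLocations="my-schema.xsd"/>
    </message-filter>
    <!-- from here on the payload is known to be valid against the XSD -->
</flow>

<flow name="process-invalid-xml">
    <!-- handle (or just log) the messages that failed validation -->
    <logger level="WARN" message="Received XML that does not validate against the XSD"/>
</flow>
```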
For the past few months I have been involved in a few Mule ESB projects again. Although it had been a while since I last worked with Mule, people still know where to find me for project work with the Mule ESB.
At the same time I was contacted by Packt Publishing to review their latest book about Mule 3: the Mule ESB Cookbook.
I was really looking forward to it. Although I have been using Mule for quite some time and in different environments, I haven’t had a project that didn’t teach me something new, so I was hoping to pick up some more hidden gems I wasn’t aware of. However, this is not the book that will show me those hidden gems. To be honest, I am not sure who would benefit from it. The recipes discuss rather simple use cases in my opinion, and mostly show how to work your way through Mule Studio, which offers a graphical interface for creating your Mule flows. If you want a deeper understanding of the Mule ESB, this book isn’t for you. I also don’t expect it to be of much use when you run into real-world issues, because it is simply too simplistic. The good news is that this leaves room for a more advanced Mule cookbook by, for example, David Dossot.
A while ago I posted how to set up an EMR cluster using the CLI. In this post I will show how to set up the cluster using the Java SDK for AWS.
In my opinion, the best way to show how to do this with the AWS SDK for Java is to walk through a complete example, so let’s start.
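As a minimal sketch of the shape such an example takes, assuming the AWS SDK for Java (v1) is on the classpath and credentials are available via the default provider chain; the cluster name, instance types, Hadoop version, and S3 paths below are all illustrative:

```java
import com.amazonaws.services.elasticmapreduce.AmazonElasticMapReduceClient;
import com.amazonaws.services.elasticmapreduce.model.HadoopJarStepConfig;
import com.amazonaws.services.elasticmapreduce.model.JobFlowInstancesConfig;
import com.amazonaws.services.elasticmapreduce.model.RunJobFlowRequest;
import com.amazonaws.services.elasticmapreduce.model.RunJobFlowResult;
import com.amazonaws.services.elasticmapreduce.model.StepConfig;

public class EmrClusterStarter {

    public static void main(String[] args) {
        // Describe the instances for the cluster (all values illustrative)
        JobFlowInstancesConfig instances = new JobFlowInstancesConfig()
                .withInstanceCount(3)
                .withMasterInstanceType("m1.small")
                .withSlaveInstanceType("m1.small")
                .withHadoopVersion("1.0.3")
                .withKeepJobFlowAliveWhenNoSteps(false);

        // A single step that runs a custom jar from S3
        StepConfig step = new StepConfig()
                .withName("my-hadoop-job")
                .withActionOnFailure("TERMINATE_JOB_FLOW")
                .withHadoopJarStep(new HadoopJarStepConfig()
                        .withJar("s3://my-bucket/jobs/my-job.jar")
                        .withArgs("s3://my-bucket/input/", "s3://my-bucket/output/"));

        RunJobFlowRequest request = new RunJobFlowRequest()
                .withName("my-emr-cluster")
                .withLogUri("s3://my-bucket/logs/")
                .withInstances(instances)
                .withSteps(step);

        // Picks up credentials from the default provider chain
        AmazonElasticMapReduceClient emr = new AmazonElasticMapReduceClient();
        RunJobFlowResult result = emr.runJobFlow(request);
        System.out.println("Started job flow: " + result.getJobFlowId());
    }
}
```

Note that runJobFlow returns as soon as the request is accepted; the cluster itself takes a while to start, so you typically poll its state (or watch the AWS console) to follow progress.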
In my previous post I showed how to set up a complete Maven-based project to create a Hadoop job in Java. Of course it wasn’t complete, because it was missing the unit test part :-). In this post I show how to add MapReduce unit tests to the project I started previously. For the unit tests I make use of the MRUnit framework.
- Add the necessary dependency to the pom
Add the following dependency to the pom:
<dependency>
    <groupId>org.apache.mrunit</groupId>
    <artifactId>mrunit</artifactId>
    <version>1.0.0</version>
    <classifier>hadoop1</classifier>
    <scope>test</scope>
</dependency>
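With the dependency in place, a mapper test looks roughly like this. DictionaryMapper below is a hypothetical mapper included only to make the sketch self-contained (it splits ‘english,translation’ lines and emits the English word as key and the translation as value); substitute your own mapper and key/value types:

```java
import java.io.IOException;

import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.mrunit.mapreduce.MapDriver;
import org.junit.Before;
import org.junit.Test;

public class DictionaryMapperTest {

    // Minimal (hypothetical) mapper under test: emits the English word
    // as key and the translation as value for each "english,translation" line.
    public static class DictionaryMapper
            extends Mapper<LongWritable, Text, Text, Text> {
        @Override
        protected void map(LongWritable key, Text value, Context context)
                throws IOException, InterruptedException {
            String[] parts = value.toString().split(",", 2);
            if (parts.length == 2) {
                context.write(new Text(parts[0].trim()), new Text(parts[1].trim()));
            }
        }
    }

    private MapDriver<LongWritable, Text, Text, Text> mapDriver;

    @Before
    public void setUp() {
        mapDriver = MapDriver.newMapDriver(new DictionaryMapper());
    }

    @Test
    public void mapsDictionaryLineToWordAndTranslation() throws IOException {
        // MRUnit feeds the input to the mapper and verifies the expected output
        mapDriver
            .withInput(new LongWritable(0), new Text("aardvark,aardvarken"))
            .withOutput(new Text("aardvark"), new Text("aardvarken"))
            .runTest();
    }
}
```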
Although the Hadoop framework itself is written in Java, MapReduce jobs can be written in many different languages. In this post I show how to create a MapReduce job in Java, based on a Maven project like any other Java project.
- Prepare the example input
Let’s start with a fictional business case. In this case we need a CSV file with English words from a dictionary and all translations in other languages added to it, separated by a ‘|’ symbol. I have based this example on this post. So the job will read dictionaries of different languages and match each English word with its translation in another language. The input dictionaries for the job are taken from here. I downloaded a few files in different languages and put them together in one file (Hadoop is better at processing one large file than many small ones). My example file can be found here. Continue reading
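Before wiring this into Hadoop, the core of the job can be sketched in plain Java: group the dictionary lines by English word and join each word with all its translations using the ‘|’ separator. This is only a simplified stand-in for what the reducer will do, and the ‘english,translation’ input format is an assumption for illustration:

```java
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class DictionaryJoin {

    // Groups "english,translation" lines by the English word and joins
    // each word with all of its translations using the '|' separator.
    public static List<String> join(List<String> lines) {
        Map<String, List<String>> byWord = new LinkedHashMap<>();
        for (String line : lines) {
            int comma = line.indexOf(',');
            if (comma < 0) {
                continue; // skip malformed lines
            }
            String word = line.substring(0, comma).trim();
            String translation = line.substring(comma + 1).trim();
            byWord.computeIfAbsent(word, w -> new ArrayList<>()).add(translation);
        }
        List<String> result = new ArrayList<>();
        for (Map.Entry<String, List<String>> entry : byWord.entrySet()) {
            result.add(entry.getKey() + "|" + String.join("|", entry.getValue()));
        }
        return result;
    }

    public static void main(String[] args) {
        List<String> lines = List.of("dog,hond", "cat,kat", "dog,hund");
        System.out.println(join(lines)); // prints [dog|hond|hund, cat|kat]
    }
}
```

In the actual job, the mapper emits (English word, translation) pairs and the reducer performs this grouping and joining per key.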
In most applications I build it is very convenient to send out mail in certain situations. In my last project I had to send mail from a JEE application deployed on JBoss 7.1. For a JEE application it is very common to define the mail service in your container, JBoss 7.1 in this case.
To configure GMail as the mail server in JBoss you have to modify the main configuration file in two places. The main configuration file for a standalone installation is ‘$JBOSS_HOME/standalone/configuration/standalone.xml’ and for a clustered setup ‘$JBOSS_HOME/domain/configuration/domain.xml’. Continue reading
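The two places look roughly as follows; the JNDI name and the account credentials are illustrative, and whether you use SSL on port 465 or STARTTLS on port 587 depends on your setup:

```xml
<!-- 1. In the mail subsystem: point the default session at GMail over SSL -->
<subsystem xmlns="urn:jboss:domain:mail:1.0">
    <mail-session jndi-name="java:jboss/mail/Default">
        <smtp-server outbound-socket-binding-ref="mail-smtp" ssl="true">
            <login name="your.account@gmail.com" password="your-password"/>
        </smtp-server>
    </mail-session>
</subsystem>

<!-- 2. In the socket binding group: define where 'mail-smtp' points to -->
<outbound-socket-binding name="mail-smtp">
    <remote-destination host="smtp.gmail.com" port="465"/>
</outbound-socket-binding>
```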
Every now and then I get an update from CloudCheckr about the new features they added to their tool. It is quite impressive what they can achieve for your AWS account(s). If you have a look at the infographic they made earlier this year you can see that almost every company that uses AWS (companies with at least 10 EC2 instances) violates at least one of the best practices they check in the areas ‘cost’, ‘availability’ and ‘security’: Continue reading
Last night I stumbled upon this free online training program where you can learn how to combine Hadoop and Amazon AWS! You even receive a certificate if you pass the test at the end of the course. So I started the training by watching the supplied videos and yes, after passing the test I got the certificate:
I recently started to use Liquibase on a project to keep track of the database changes in our Java Enterprise application. I must say that I like the way it works. It makes the deployment of my application (or a new release of it) to another environment easier and (more) foolproof. In the past I had to supply a database script to the DBA, which had to be executed right after or before I redeployed my EAR/WAR file, with all the issues that come with that procedure (the script fails, the DBA is not available, etc.). Now I won’t say there couldn’t be issues with this solution, but from a developer perspective it does make life easier.
Here is how it works for a straightforward Maven project with some web services that are backed by a MySQL database. Since I deploy the web services in a WAR file on JBoss, I chose to have the Liquibase scripts triggered by a ServletContextListener instance. To be able to test the database scripts without having to deploy the application, I also embedded the Maven plugin for Liquibase in my pom file. This way I can run the Liquibase scripts manually with Maven against my local development database.
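As a sketch, the listener wiring in web.xml can use the servlet listener that ships with Liquibase; the changelog path and the datasource JNDI name below are illustrative:

```xml
<!-- web.xml: run the Liquibase changelog on application startup -->
<context-param>
    <param-name>liquibase.changelog</param-name>
    <param-value>db/changelog/db.changelog-master.xml</param-value>
</context-param>
<context-param>
    <param-name>liquibase.datasource</param-name>
    <param-value>java:jboss/datasources/MyAppDS</param-value>
</context-param>
<listener>
    <listener-class>liquibase.integration.servlet.LiquibaseServletListener</listener-class>
</listener>
```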
First add the necessary dependencies to the Maven pom.xml: Continue reading
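As a sketch of what that looks like, assuming Liquibase 3.0.x (pick the version you actually use), both the runtime dependency and the Maven plugin go into the pom; the changelog and property file paths are illustrative:

```xml
<dependency>
    <groupId>org.liquibase</groupId>
    <artifactId>liquibase-core</artifactId>
    <version>3.0.2</version>
</dependency>

<!-- Lets you run the scripts manually, e.g. mvn liquibase:update -->
<plugin>
    <groupId>org.liquibase</groupId>
    <artifactId>liquibase-maven-plugin</artifactId>
    <version>3.0.2</version>
    <configuration>
        <changeLogFile>src/main/resources/db/changelog/db.changelog-master.xml</changeLogFile>
        <propertyFile>src/main/resources/liquibase.properties</propertyFile>
    </configuration>
</plugin>
```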