Sunday, 27 October 2013

Running Hadoop locally without installation

If we want to take Hadoop for a test-drive without installing the whole distribution, we can do it quite easily.

First of all, let's create a maven project with the following dependencies:

<dependency>
<groupId>org.apache.mahout.hadoop</groupId>
<artifactId>hadoop-core</artifactId>
<version>0.20.1</version>
</dependency>
<dependency>
<groupId>ch.qos.logback</groupId>
<artifactId>logback-classic</artifactId>
<version>1.0.13</version>
</dependency>
<dependency>
<groupId>commons-httpclient</groupId>
<artifactId>commons-httpclient</artifactId>
<version>3.1</version>
</dependency>
There is a known issue with running newer versions on Windows, so an older one is chosen.
Cygwin is also required to be installed when running on Windows.

We will create a job to count the words in files (it's a well-known example taken from the official tutorial).

Our mapper would look like:

public class WordCountMapper extends Mapper<LongWritable, Text, Text, IntWritable> {
private final static IntWritable ONE = new IntWritable(1);
private Text word = new Text();
@Override
public void map(LongWritable key, Text value, Context context) throws IOException, InterruptedException {
String line = value.toString();
StringTokenizer tokenizer = new StringTokenizer(line);
while (tokenizer.hasMoreTokens()) {
word.set(tokenizer.nextToken());
context.write(word, ONE);
}
}
}
The mapper splits each line of the file into words and passes on each word as the key with a value of one.

Here comes the reducer:

public class WordCountReducer extends Reducer<Text, IntWritable, Text, IntWritable> {
@Override
public void reduce(Text key, Iterable<IntWritable> values, Context context) throws IOException, InterruptedException {
int sum = 0;
for (IntWritable val : values) {
sum += val.get();
}
context.write(key, new IntWritable(sum));
}
}
The reducer receives all values for a given key and sums them up.

All that is left is a main class that will run the job:

public class App {
public static void main(String[] args) throws Exception {
Job job = new Job();
job.setJarByClass(App.class);
FileInputFormat.addInputPath(job, new Path("src/main/resources"));
FileOutputFormat.setOutputPath(job, new Path("results/output-" + System.currentTimeMillis()));
job.setMapperClass(WordCountMapper.class);
job.setReducerClass(WordCountReducer.class);
job.setOutputKeyClass(Text.class);
job.setOutputValueClass(IntWritable.class);
boolean result = job.waitForCompletion(true);
System.exit(result ? 0 : 1);
}
}
We are setting the job's mapper, reducer and classes for the key and value.
Input and output paths are set as well.
You can run it directly and check the output file with the result.
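For instance, if the input files contained the text "hello hadoop hello", the output directory should hold a part file (typically named part-r-00000) with tab-separated word counts along these lines:

hadoop	1
hello	2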
The whole project can be found on GitHub.

Saturday, 21 September 2013

Transaction management with Spring Data JPA

Spring provides an easy way to manage transactions.
Let's see how to make our methods transactional without using any XML configuration.

We will start with Spring Java config for our application:

@EnableJpaRepositories
@EnableTransactionManagement
@ComponentScan("pl.mjedynak")
@Configuration
public class AppConfig {
@Bean
public DataSource dataSource() {
return new EmbeddedDatabaseBuilder().setType(EmbeddedDatabaseType.HSQL).build();
}
@Bean
public LocalContainerEntityManagerFactoryBean entityManagerFactory() {
LocalContainerEntityManagerFactoryBean entityManagerFactory = new LocalContainerEntityManagerFactoryBean();
entityManagerFactory.setDataSource(dataSource());
entityManagerFactory.setJpaVendorAdapter(jpaVendorAdapter());
entityManagerFactory.setPackagesToScan("pl.mjedynak.model");
return entityManagerFactory;
}
@Bean
public JpaVendorAdapter jpaVendorAdapter() {
HibernateJpaVendorAdapter hibernateJpaVendorAdapter = new HibernateJpaVendorAdapter();
hibernateJpaVendorAdapter.setShowSql(false);
hibernateJpaVendorAdapter.setGenerateDdl(true);
hibernateJpaVendorAdapter.setDatabase(Database.HSQL);
return hibernateJpaVendorAdapter;
}
@Bean
public PlatformTransactionManager transactionManager(EntityManagerFactory entityManagerFactory) {
return new JpaTransactionManager(entityManagerFactory);
}
}
We have an entity representing a bank account:

@Entity
public class Account {
@Id
@GeneratedValue
private Long id;
private BigDecimal balance;
public Long getId() {
return id;
}
public void setId(Long id) {
this.id = id;
}
public BigDecimal getBalance() {
return balance;
}
public void setBalance(BigDecimal balance) {
this.balance = balance;
}
@Override
public int hashCode() {
return Objects.hash(id, balance);
}
@Override
public boolean equals(Object obj) {
if (this == obj) {
return true;
}
if (obj == null || getClass() != obj.getClass()) {
return false;
}
final Account other = (Account) obj;
return Objects.equals(this.id, other.id) && Objects.equals(this.balance, other.balance);
}
@Override
public String toString() {
return "Account{" +
"id=" + id +
", balance=" + balance +
'}';
}
}
We also have a repository from Spring Data JPA for Account objects:

public interface AccountRepository extends CrudRepository<Account, Long> {
}

The TransferService allows transferring money from one account to another:

@Component
public class TransferService {
@Autowired private AccountRepository accountRepository;
@Transactional
public void transfer(Account from, Account to, BigDecimal amount) {
BigDecimal currentFromBalance = from.getBalance();
BigDecimal currentToBalance = to.getBalance();
to.setBalance(currentToBalance.add(amount));
accountRepository.save(to);
if (currentFromBalance.compareTo(amount) < 0) {
throw new IllegalStateException("not enough money");
}
from.setBalance(currentFromBalance.subtract(amount));
accountRepository.save(from);
}
}
Just for the example's sake, we first add money to one account, and before subtracting from the second one we check whether it has enough funds.
If the method weren't transactional, we would introduce a major bug.
However, the @Transactional annotation makes the method eligible for rollback when the exception is thrown.
It's also important to note that many methods from the repository are transactional with default propagation, so the transaction from our service will be reused.
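
As a side note, if we ever needed a method to run in its own transaction instead of joining the caller's, we could declare the propagation explicitly. A minimal sketch, assuming a hypothetical method that is not part of this project:

@Transactional(propagation = Propagation.REQUIRES_NEW)
public void recordBalanceSnapshot(Account account) {
    // hypothetical method - saves the account in a new, independent transaction;
    // a rollback in the calling transfer() would not undo this write
    // (the call must go through the Spring proxy for this to apply)
    accountRepository.save(account);
}
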
Let's make sure that it works by writing an integration test:

@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration(classes = AppConfig.class)
public class TransferServiceIntegrationTest {
@Autowired private AccountRepository accountRepository;
@Autowired private TransferService transferService;
@Test
public void shouldRollbackTransactionWhenExceptionIsThrown() {
// given
Account from = anAccount().withBalance(valueOf(100)).build();
Account to = anAccount().withBalance(valueOf(0)).build();
accountRepository.save(asList(from, to));
BigDecimal amount = valueOf(200);
// when
try {
transferService.transfer(from, to, amount);
} catch (IllegalStateException e) {
// then
Account returnedFrom = accountRepository.findOne(from.getId());
Account returnedTo = accountRepository.findOne(to.getId());
assertThat(returnedFrom.getBalance().doubleValue(), is(100d));
assertThat(returnedTo.getBalance().doubleValue(), is(0d));
return;
}
fail();
}
}
If we remove the @Transactional annotation, the test will fail.
Be aware that managing transactions with Spring has some traps that are described in this article.
The whole project can be found on GitHub.

Thursday, 29 August 2013

Changing application behavior at runtime with JMX

Sometimes we need to be able to change the behavior of our application without a restart.
JMX, apart from its monitoring capabilities, is a perfect solution for this.
Spring provides great JMX support that will ease our task.

Let's start with a simple service whose behavior we will change at runtime.

public class DiscountService {
private AtomicInteger globalDiscount = new AtomicInteger(12);
public int calculateDiscount() {
return globalDiscount.get() * 2;
}
}
DiscountService calculates a discount based on globalDiscount - its value is hardcoded for simplicity; in a more realistic example it would probably be read from a configuration file or database.

First of all, in order to expose methods to manage globalDiscount, we need to add the @ManagedResource annotation to our class and annotate the methods with @ManagedOperation.
We could also use @ManagedAttribute if we treated these methods as a simple getter and setter for globalDiscount.

The class with the needed methods and annotations would look like:

@ManagedResource
public class DiscountService {
private AtomicInteger globalDiscount = new AtomicInteger(12);
public int calculateDiscount() {
return globalDiscount.get() * 2;
}
@ManagedOperation
public int checkGlobalDiscount() {
return globalDiscount.get();
}
@ManagedOperation
public void modifyGlobalDiscount(int newDiscount) {
globalDiscount.set(newDiscount);
}
}
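
If we preferred the attribute style mentioned above, a minimal sketch could look like the following (the getter/setter pair is an assumption for illustration, not part of the project):

@ManagedResource
public class DiscountService {
    private AtomicInteger globalDiscount = new AtomicInteger(12);

    public int calculateDiscount() {
        return globalDiscount.get() * 2;
    }

    // the annotated getter/setter pair shows up in jconsole
    // as a single read-write attribute instead of two operations
    @ManagedAttribute
    public int getGlobalDiscount() {
        return globalDiscount.get();
    }

    @ManagedAttribute
    public void setGlobalDiscount(int newDiscount) {
        globalDiscount.set(newDiscount);
    }
}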

In the Spring configuration we just need to define the bean for DiscountService and enable exporting MBeans with an MBean server.

@Configuration
@EnableMBeanExport
@ComponentScan("pl.mjedynak")
public class AppConfig {
@Bean
public DiscountService discountService() {
return new DiscountService();
}
@Bean
public MBeanServerFactoryBean mbeanServer() {
return new MBeanServerFactoryBean();
}
}

We can run the application with:

public class App {
public static void main(String[] args) throws IOException {
new AnnotationConfigApplicationContext(AppConfig.class);
System.in.read();
}
}

Now we're ready to manage our service with jconsole:


Our service is exposed locally, but if we want to be able to connect to it remotely, we will need to add the following beans to the Spring configuration:

@Bean
public RmiRegistryFactoryBean registry() {
return new RmiRegistryFactoryBean();
}
@Bean
@DependsOn("registry")
public ConnectorServerFactoryBean connectorServer() throws MalformedObjectNameException {
ConnectorServerFactoryBean connectorServerFactoryBean = new ConnectorServerFactoryBean();
connectorServerFactoryBean.setObjectName("connector:name=rmi");
connectorServerFactoryBean.setServiceUrl("service:jmx:rmi://localhost/jndi/rmi://localhost:1099/connector");
return connectorServerFactoryBean;
}

The service is now exposed via RMI.
We can invoke the exposed methods programmatically, which allows us to write scripts and manage services without using jconsole.

Let's write an integration test to check that it works correctly.
We will need a Spring config for the test with the RMI client.

@Configuration
public class AppJmxIntegrationTestConfig {
@Bean
public MBeanServerConnectionFactoryBean clientConnector() throws MalformedURLException {
MBeanServerConnectionFactoryBean mBeanServerConnectionFactoryBean = new MBeanServerConnectionFactoryBean();
mBeanServerConnectionFactoryBean.setServiceUrl("service:jmx:rmi://localhost/jndi/rmi://localhost:1099/connector");
return mBeanServerConnectionFactoryBean;
}
}

The test will increment the value of globalDiscount.
Take a closer look at the invocation of the exposed methods - it is quite cumbersome, especially when the method has parameters.

@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration(classes = {AppJmxIntegrationTestConfig.class})
public class AppJmxIntegrationTest {
private static final String CHECK_GLOBAL_DISCOUNT_METHOD_NAME = "checkGlobalDiscount";
private static final String MODIFY_GLOBAL_DISCOUNT_METHOD_NAME = "modifyGlobalDiscount";
@Autowired
private MBeanServerConnectionFactoryBean clientConnector;
@Test
public void shouldInvokeOperations() throws Exception {
// given
MBeanServerConnection connection = clientConnector.getObject();
ObjectName objectName = new ObjectName("pl.mjedynak:name=discountService,type=DiscountService");
// when
Integer oldDiscount = (Integer) connection.invoke(objectName, CHECK_GLOBAL_DISCOUNT_METHOD_NAME, null, null);
Integer newDiscount = ++oldDiscount;
connection.invoke(objectName, MODIFY_GLOBAL_DISCOUNT_METHOD_NAME, new Object[]{newDiscount}, new String[]{int.class.getName()});
Integer currentDiscount = (Integer) connection.invoke(objectName, CHECK_GLOBAL_DISCOUNT_METHOD_NAME, null, null);
// then
assertThat(currentDiscount, is(newDiscount));
}
}

The whole project can be found on GitHub.

Sunday, 21 July 2013

Caching with Spring Cache

From time to time we need to use a cache in our application.
Instead of writing it on our own and reinventing the wheel, we can use Spring Cache.
Its biggest advantage is unobtrusiveness - besides a few annotations, we keep our code intact.

Let's see how to do it.
First, let's have a look at a fake service that is the reason for using a cache (in real life it might be a database call, a web service, etc.).

public class SlowService {
private Logger logger = LoggerFactory.getLogger(SlowService.class.getName());
public boolean isVipClient(String clientName) {
logger.debug("Checking " + clientName);
boolean result = false;
if (clientName.hashCode() % 2 == 0) {
result = true;
}
return result;
}
}

We will cache the invocations of isVipClient(...).

Let's start with adding dependencies to our project:

<dependency>
<groupId>ch.qos.logback</groupId>
<artifactId>logback-classic</artifactId>
<version>1.0.13</version>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-context</artifactId>
<version>3.2.3.RELEASE</version>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-test</artifactId>
<version>3.2.3.RELEASE</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<version>4.11</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.kubek2k</groupId>
<artifactId>springockito</artifactId>
<version>1.0.5</version>
<scope>test</scope>
</dependency>

We'd like to use an XML-free Spring config, so we will define our beans in Java:

@Configuration
@EnableCaching
@ComponentScan("pl.mjedynak")
public class AppConfig {
public static final String VIP_CLIENTS_CACHE = "vipClients";
@Bean
public SlowService slowService() {
return new SlowService();
}
@Bean
public CacheManager cacheManager() {
SimpleCacheManager cacheManager = new SimpleCacheManager();
cacheManager.setCaches(Arrays.asList(new ConcurrentMapCache(VIP_CLIENTS_CACHE)));
return cacheManager;
}
}

We have our SlowService defined as a bean. We also have the cacheManager along with a cache name and its implementation - in our case it's based on a ConcurrentHashMap. We can't forget about enabling caching via the @EnableCaching annotation.

The only thing left to do is to add the @Cacheable annotation with the cache name to our method:

@Cacheable(AppConfig.VIP_CLIENTS_CACHE)
public boolean isVipClient(String clientName) { ...


How to test our cache?
One way to do it is to check if the double invocation of our method with the same parameter will actually execute the method only once.
We will use springockito to inject a spy into our class and verify its behaviour.
We need to start with a Spring test context:

<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://www.springframework.org/schema/beans"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xmlns:context="http://www.springframework.org/schema/context"
xmlns:mockito="http://www.mockito.org/spring/mockito"
xsi:schemaLocation="http://www.springframework.org/schema/beans
http://www.springframework.org/schema/beans/spring-beans-3.1.xsd
http://www.springframework.org/schema/context
http://www.springframework.org/schema/context/spring-context.xsd
http://www.mockito.org/spring/mockito classpath:spring/mockito.xsd">
<context:component-scan base-package="pl.mjedynak"/>
<mockito:spy beanName="slowService"/>
</beans>

Basically, the context will read the configuration from the Java class and replace the slowService bean with a spy. It would be nicer to do it with an annotation, but at the time of this writing @WrapWithSpy doesn't work (https://bitbucket.org/kubek2k/springockito/issue/33).

Here's our test:
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration
public class SlowServiceIntegrationTest {
@Autowired
private SlowService slowService;
@Test
public void shouldUseCacheWhenCallingCachedMethodWithTheSameParameter() {
// given
String clientName = "WakaWaka";
// when
slowService.isVipClient(clientName);
slowService.isVipClient(clientName);
// then
verify(slowService).isVipClient(clientName);
}
}

This was a very basic example; Spring Cache offers more advanced features like cache eviction and update.
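For instance, a rough sketch of how eviction and update could look on our service - the method names below are made up for illustration and are not part of the project:

// removes the cached entry, so the next isVipClient call executes the method again
@CacheEvict(value = AppConfig.VIP_CLIENTS_CACHE, key = "#clientName")
public void forgetClient(String clientName) {
}

// always executes and replaces the cached value for this client
@CachePut(value = AppConfig.VIP_CLIENTS_CACHE, key = "#clientName")
public boolean refreshVipStatus(String clientName) {
    return clientName.hashCode() % 2 == 0;
}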

The whole project can be found on GitHub.


Saturday, 29 June 2013

Handling command line arguments with args4j

From time to time, we need to write a tool that is using command line arguments as input.
Having an interface similar to Unix command-line tools is not a trivial task, but with args4j it becomes quite easy.

We're going to write a fake tool called FileCompressor whose invocation would look like:

compressor -i inputFile.txt -o outputFile.txt -p high
where -i is the input file name, -o the output file name and -p the priority of the process.

Let's start with defining the dependencies of our project:

<dependency>
<groupId>args4j</groupId>
<artifactId>args4j</artifactId>
<version>2.0.23</version>
</dependency>
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<version>4.11</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-test</artifactId>
<version>3.2.2.RELEASE</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.mockito</groupId>
<artifactId>mockito-core</artifactId>
<version>1.9.5</version>
<scope>test</scope>
</dependency>
We'll need args4j; the rest is for testing.

Our fake FileCompressor requires a configuration that will be populated by args4j.

public class FileCompressor {
public void compressFile(Configuration configuration) {
System.out.println("Received: " + configuration);
}
}
The configuration object has fields that will be mapped to command line arguments with args4j annotations.

public class Configuration {
private enum Priority {
HIGH, LOW;
}
@Option(name = "-i", usage = "input file name", required = true)
private String inputFileName;
@Option(name = "-o", usage = "output file name", required = true)
private String outputFileName;
@Option(name = "-p", usage = "process priority")
private Priority priority = Priority.HIGH;
public String getInputFileName() {
return inputFileName;
}
public String getOutputFileName() {
return outputFileName;
}
public Priority getPriority() {
return priority;
}
@Override
public String toString() {
return "Configuration{" +
"inputFileName='" + inputFileName + '\'' +
", outputFileName='" + outputFileName + '\'' +
", priority=" + priority +
'}';
}
}
And here is the class that will be invoked:

public class App {
private static Configuration configuration = new Configuration();
private static FileCompressor fileCompressor = new FileCompressor();
private static CmdLineParser parser = new CmdLineParser(configuration);
private static PrintStream outputStream = System.out;
public static void main(String[] args) {
try {
parser.parseArgument(args);
fileCompressor.compressFile(configuration);
} catch (CmdLineException e) {
outputStream.println(e.getMessage());
parser.printUsage(outputStream);
}
}
}
First of all, the arguments are parsed and the configuration object is populated with them.
If there is a parsing problem, the exception message is printed along with the usage (args4j can automatically print the usage of our application based on the annotations used).

Testing our application requires some effort.
To begin with, we would like to completely mock the FileCompressor. There is no setter for it, so we'll use Spring's ReflectionTestUtils together with Mockito.

@RunWith(MockitoJUnitRunner.class)
public class AppTest {
private static final String INPUT_FILE = "input.txt";
private static final String OUTPUT_FILE = "output.txt";
private static final String INVALID_ARGUMENT = "-www";
@Mock
private FileCompressor fileCompressor;
private App app = new App();
@Before
public void setUp() {
setField(app, "fileCompressor", fileCompressor);
}
We've also prepared some test data.

Let's check if the configuration object is populated properly and if the collaborator was invoked:

@Test
public void shouldPopulateConfiguration() {
// when
app.main((String[]) asList("-i", INPUT_FILE, "-o", OUTPUT_FILE).toArray());
// then
Configuration configuration = (Configuration) getField(app, "configuration");
assertThat(configuration.getInputFileName(), is(INPUT_FILE));
assertThat(configuration.getOutputFileName(), is(OUTPUT_FILE));
}
@Test
public void shouldInvokeFileCompressor() {
// when
app.main((String[]) asList("-i", INPUT_FILE, "-o", OUTPUT_FILE).toArray());
// then
verify(fileCompressor).compressFile(isA(Configuration.class));
}
We would also like to know if the usage was printed when invalid arguments were passed.
To do that we need to use a spy, as we want to keep the original behavior of populating the arguments, but we also want to verify that the printUsage method was invoked.

@Test
public void shouldPrintUsageWhenInvalidArguments() {
// given
Configuration configuration = (Configuration) getField(app, "configuration");
CmdLineParser parser = spy(new CmdLineParser(configuration));
setField(app, "parser", parser);
String[] invalidArguments = (String[]) asList(INVALID_ARGUMENT, INPUT_FILE).toArray();
// when
app.main(invalidArguments);
// then
verify(parser).printUsage(isA(PrintStream.class));
}
The last thing to check is whether the exception was printed to the output stream when invalid arguments were passed.
Similarly, we need to use a spy.

@Test
public void shouldPrintExceptionOnOutputStreamWhenInvalidArguments() {
// given
PrintStream outputStream = spy(System.out);
setField(app, "outputStream", outputStream);
String[] invalidArguments = (String[]) asList(INVALID_ARGUMENT, INPUT_FILE).toArray();
// when
app.main(invalidArguments);
// then
verify(outputStream).println("\"" + INVALID_ARGUMENT + "\" is not a valid option");
}
The whole project can be found on GitHub.

Sunday, 19 May 2013

Testing JavaScript with Jasmine

When developing a web application, sooner or later we need to deal with JavaScript. To keep code quality high, we must write unit tests.
Jasmine is a nice testing framework that allows us to write tests in a BDD manner.
Let's see how to use it in our project.

First of all, we need to incorporate it into our build. There is a Maven plugin that will let us execute JavaScript tests within the test build phase. We will add it to our pom.xml:
 
<plugin>
<groupId>com.github.searls</groupId>
<artifactId>jasmine-maven-plugin</artifactId>
<version>1.3.1.2</version>
<executions>
<execution>
<goals>
<goal>test</goal>
</goals>
</execution>
</executions>
</plugin>

Apart from executing tests within the build, we can run them during the development phase.
The Maven goal jasmine:bdd starts a Jetty server, and under the http://localhost:8234 URL we can see the results of executing the test fixtures.
Reloading the page will execute the latest version of our code.
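Starting it is as simple as running the goal from the project directory:

mvn jasmine:bdd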

Let's see it in action. We have a simple JavaScript function residing in src/main/javascript/simple.js that increments the number passed as an argument.

var increment = function (number) {
if (isNaN(number)) {
throw 'Argument ' + number + ' is not a number'
}
return number + 1
}
incrementer = {
increment: increment
}
The test (in Jasmine called a spec) is located in src/test/javascript/simple_spec.js:

describe('incrementer tests', function() {
it ('should be defined', function() {
expect(incrementer).toBeDefined()
})
it ('should increment', function() {
expect(incrementer.increment(1)).toEqual(2)
})
it ('should throw exception when argument is not a number', function() {
var illegalArg = 'stringy'
var thrown
try {
incrementer.increment(illegalArg)
} catch (e) {
thrown = e
}
expect(thrown).toBe('Argument ' + illegalArg + ' is not a number')
})
})
A test begins with a call to the global Jasmine function describe.
Then the specific test cases are defined by calling the function it.
We're testing the sunny-day scenario and the border case that throws an exception.
At the beginning, as a sanity check, we test whether the function is defined.
Jasmine has a lot of advanced features; we can even use mocks, expectations and argument matchers.

Nothing stops us from writing tests first and going through the red-green-refactor cycle.
Let's write a function that will count the number of specific elements on our page.

The behaviour of our component is defined by the following spec:

describe('counter tests', function() {
it ('should be defined', function() {
expect(elementCounter).toBeDefined()
})
it ('should count 0 elements', function() {
expect(elementCounter.count()).toEqual(0)
})
it ('should count 1 element', function() {
var container = document.createElement('div')
container.setAttribute('id','myId')
document.body.appendChild(container)
expect(elementCounter.count()).toEqual(1)
})
})
Before testing the case with one element, we're adding it to the document. In a more complex case we could load HTML content.

To implement it, we'll use jQuery (we need to add it to src/main/javascript):

var count = function () {
return $('#myId').length
}
elementCounter = {
count: count
}
After executing mvn test we'll see a nice output:

-------------------------------------------------------
J A S M I N E S P E C S
-------------------------------------------------------
[INFO]
counter tests
should be defined
should count 0 elements
should count 1 element
incrementer tests
should be defined
should increment
should throw exception when argument is not a number
Results: 6 specs, 0 failures
The whole project can be found on GitHub.

Thursday, 25 April 2013

Testing Spring Integration

Adding Spring Integration to our project (apart from many advantages) brings drawbacks as well. One of them is more difficult testing.

Let's try to test the flow from MessageRouter to Persister via JSONHandler (please refer to the previous post).

First of all, it won't be a unit test but an integration test, as we need to start up the Spring context and test the flow between components.

Let's prepare the Spring XML config:

<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://www.springframework.org/schema/beans"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="
http://www.springframework.org/schema/beans
http://www.springframework.org/schema/beans/spring-beans-3.1.xsd">
<import resource="../../../spring-context.xml"/>
<bean id="messageRouter" class="org.mockito.Mockito" factory-method="mock">
<constructor-arg value="pl.mjedynak.spring.MessageRouter"/>
</bean>
<bean id="XMLHandler" class="org.mockito.Mockito" factory-method="mock">
<constructor-arg value="pl.mjedynak.spring.XMLHandler"/>
</bean>
<bean id="JSONHandler" class="org.mockito.Mockito" factory-method="mock">
<constructor-arg value="pl.mjedynak.spring.JSONHandler"/>
</bean>
<bean id="persister" class="org.mockito.Mockito" factory-method="mock">
<constructor-arg value="pl.mjedynak.spring.Persister"/>
</bean>
</beans>
We're importing the application config and overriding beans with Mockito mocks.

The test would look like:

@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration
public class AppIntegrationTest {
@Autowired
private MessageChannel messageChannel;
@Autowired
private MessageRouter messageRouter;
@Autowired
private JSONHandler jsonHandler;
@Autowired
private XMLHandler xmlHandler;
@Autowired
private Persister persister;
@Before
public void resetMocks() {
reset(messageRouter, jsonHandler, xmlHandler, persister);
}
@Test
public void shouldRouteToPersisterViaJSONHandler() {
String message = "json message";
given(messageRouter.route(message)).willReturn("JSONChannel");
given(jsonHandler.process(message)).willReturn(message);
messageChannel.send(MessageBuilder.withPayload(message).build());
verify(persister).persist(message);
}
}
Spring injects all the required components. We're mocking the behaviour, sending a message to the channel and then verifying. The mocks need to be reset as well if we want to have more test methods.

By looking at the source code, we don't see which object is mocked. To solve this problem (and simplify the code as well) we can use springockito.
Then, in our config file, we only need to import the application config without overriding any beans.

The overriding part is done in the test by using annotations:

@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration(loader = SpringockitoContextLoader.class)
@DirtiesMocks(classMode = DirtiesMocks.ClassMode.AFTER_EACH_TEST_METHOD)
public class AppIntegrationTest {
@Autowired
private MessageChannel messageChannel;
@Autowired
@ReplaceWithMock
private MessageRouter messageRouter;
@Autowired
@ReplaceWithMock
private JSONHandler jsonHandler;
@Autowired
@ReplaceWithMock
private XMLHandler xmlHandler;
@Autowired
@ReplaceWithMock
private Persister persister;
@Test
public void shouldRouteToPersisterViaJSONHandler() {
String message = "json message";
given(messageRouter.route(message)).willReturn("JSONChannel");
given(jsonHandler.process(message)).willReturn(message);
messageChannel.send(MessageBuilder.withPayload(message).build());
verify(persister).persist(message);
}
}
To use springockito we need to change the context loader to SpringockitoContextLoader.
The @ReplaceWithMock annotation is self-descriptive.
The context is dirtied after the execution of each test method, which is basically equivalent to resetting the mocks.
The whole project can be found on GitHub.

Wednesday, 20 March 2013

Spring Integration vs plain Java code

Adding frameworks to our system may solve some problems and cause too much complexity at the same time.

Let's imagine a simple flow:


We have a MessageRouter object that routes a received message to either XMLHandler or JSONHandler. The processed message from both handlers is then passed to a Persister that stores the message.

Modelling it is not that difficult. Here's our MessageRouter class:

public class MessageRouter {
private XMLHandler xmlHandler = new XMLHandler();
private JSONHandler jsonHandler = new JSONHandler();
public void route(String message) {
if (message.startsWith("<?xml")) {
xmlHandler.process(message);
} else {
jsonHandler.process(message);
}
}
}
Handlers:
public class XMLHandler {
private Persister persister = new Persister();
public void process(String message) {
System.out.println("Processing xml message: " + message);
persister.persist(message);
}
}

public class JSONHandler {
private Persister persister = new Persister();
public void process(String message) {
System.out.println("Processing json message: " + message);
persister.persist(message);
}
}
And finally the persister:
public class Persister {
public void persist(String message) {
System.out.println("persisting " + message);
}
}
We can see it in action by running the following class:
public class App {
public static void main(String[] args) {
MessageRouter router = new MessageRouter();
router.route("{'json'}");
router.route("<?xml/>");
}
}
Everything seems fine here, but we have a problem - tight coupling.
The message router knows about the handlers, and the handlers know about the persister.
We would gain loose coupling and high cohesion if they weren't aware of each other, which would automatically make them concentrate on one task only.

Spring Integration can help us achieve this.
It is a separate module of Spring, enabling lightweight messaging within an application and supporting Enterprise Integration Patterns.

All the arrows from the above picture will be represented as channels.

The XML Spring context configuration for the channels looks pretty straightforward:

<int:channel id="messageChannel"/>
<int:channel id="XMLChannel"/>
<int:channel id="JSONChannel"/>
<int:channel id="persisterChannel"/>
Actually, it's not even required, as Spring can create them by default.

Apart from the channels, our MessageRouter's only role is to return the name of the channel to which the message will be passed.
@Component
public class MessageRouter {
@Router
public String route(String message) {
return message.startsWith("<?xml") ? "XMLChannel" : "JSONChannel";
}
}
Also, the handlers and the persister need to become Service Activators (their methods will be invoked when a message goes through the channel).
@Component
public class JSONHandler {
@ServiceActivator
public String process(String message) {
System.out.println("Processing json message: " + message);
return message;
}
}
@Component
public class XMLHandler {
@ServiceActivator
public String process(String message) {
System.out.println("Processing xml message: " + message);
return message;
}
}
@Component
public class Persister {
@ServiceActivator
public void persist(String message) {
System.out.println("persisting " + message);
}
}
The config for those:
<int:router input-channel="messageChannel" ref="messageRouter"/>
<int:service-activator input-channel="XMLChannel" output-channel="persisterChannel" ref="XMLHandler"/>
<int:service-activator input-channel="JSONChannel" output-channel="persisterChannel" ref="JSONHandler"/>
<int:service-activator input-channel="persisterChannel" ref="persister"/>
To see it in action we can run the following class:
public class App {
public static void main(String[] args) {
ApplicationContext context = new ClassPathXmlApplicationContext("spring-context.xml");
MessageChannel channel = context.getBean("messageChannel", MessageChannel.class);
channel.send(MessageBuilder.withPayload("{'json'}").build());
channel.send(MessageBuilder.withPayload("<?xml/>").build());
}
}
Notice that the code is concise and simple.
The components do not depend on each other.
What is more, if we wanted to modify the flow and move the Persister component in front of the MessageRouter, we would only need to change the XML config to:

<int:service-activator input-channel="XMLChannel" output-channel="nullChannel" ref="XMLHandler"/>
<int:service-activator input-channel="JSONChannel" output-channel="nullChannel" ref="JSONHandler"/>
<int:service-activator input-channel="persisterChannel" output-channel="messageChannel" ref="persister"/>
Changing the flow in the first version that uses plain Java code would require many more modifications.

Nevertheless, we increased the complexity of our application. Now we depend on a framework, and we need to maintain additional configuration in an XML file.

Another big disadvantage is testing. We could easily unit test the previous code using mocks. Now we need to test the Java code and Spring Integration configuration as well, which is not that simple.
I'll show how to do it in the next post.

Friday, 15 February 2013

MarkLogic Java client

MarkLogic is a NoSQL document database that allows us to handle XML efficiently.

Let's take a look at how to set up a MarkLogic database instance on a local machine and write a simple application that will perform CRUD and search operations on XML documents.

First of all, we will need to download the MarkLogic server (an account is required).
The installation and startup procedures are described here - they're pretty straightforward.
When the server is started, we need to create a new database with a REST API instance and a user with write access - follow this link.
REST is used by the Java client as the communication protocol, but we can also use it manually in our browser.
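For example, assuming the REST instance listens on port 8003 (as in the test later in this post), fetching a previously stored document from the browser would go through a URL roughly like http://localhost:8003/v1/documents?uri=person1 - the exact URI parameter depends on the id used when writing the document, and the server will ask for the REST user's credentials.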

Once the database is created, we can start writing the client code.

Let's start with setting up required dependencies:
<dependencies>
<dependency>
<groupId>com.marklogic</groupId>
<artifactId>client-api-java</artifactId>
<version>1.0-2</version>
</dependency>
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<version>4.11</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.xmlmatchers</groupId>
<artifactId>xml-matchers</artifactId>
<version>0.10</version>
<scope>test</scope>
</dependency>
</dependencies>
<repositories>
<repository>
<id>dmc</id>
<name>MarkLogic Developer Community</name>
<url>http://developer.marklogic.com/maven2/</url>
</repository>
</repositories>
To use the MarkLogic Java client we need to specify the MarkLogic Maven repository. We'll also use the xml-matchers library to compare the created XML documents.

Here's an example of an XML document representing a person:
<person>
<name>Robin van Persie</name>
<age>29</age>
</person>
Let's define an interface that will allow some simple CRUD and searching operations:
public interface PersonRepository {
void addPerson(String id, String person);
String getPerson(String id);
void removePerson(String id);
List<String> findByName(String name);
}
The sample implementation could look like:
public class MarkLogicPersonRepository implements PersonRepository {
private XMLDocumentManager documentManager;
private QueryManager queryManager;
public MarkLogicPersonRepository(XMLDocumentManager documentManager, QueryManager queryManager) {
this.documentManager = documentManager;
this.queryManager = queryManager;
}
public void addPerson(String id, String person) {
StringHandle handle = new StringHandle(person);
documentManager.write(id, handle);
}
public String getPerson(String personId) {
StringHandle handle = new StringHandle();
documentManager.read(personId, handle);
return handle.get();
}
public void removePerson(String personId) {
documentManager.delete(personId);
}
public List<String> findByName(String name) {
KeyValueQueryDefinition query = queryManager.newKeyValueDefinition();
queryManager.setPageLength(10); // LIMIT RESULT
query.put(queryManager.newElementLocator(new QName("name")), name);
SearchHandle resultsHandle = new SearchHandle();
queryManager.search(query, resultsHandle);
return getResultListFor(resultsHandle);
}
private List<String> getResultListFor(SearchHandle resultsHandle) {
List<String> result = new ArrayList<String>();
for (MatchDocumentSummary summary : resultsHandle.getMatchResults()) {
StringHandle content = new StringHandle();
documentManager.read(summary.getUri(), content);
result.add(content.get());
}
return result;
}
}
In order to perform CRUD operations we need a DocumentManager object (in our case an XMLDocumentManager, as we're handling XML). It is a thread-safe object (it can be shared across multiple threads) and its usage is quite intuitive. Each operation needs a specific handle object - as our interface declares String, we'll use a StringHandle that is populated with the result by the manager.

To perform query operations, a QueryManager is required. There are many types of queries; we'll use searching by element value.
It's a little bit more complicated than the simple CRUD operations - the SearchHandle object is initially populated by running the query on the query manager.
Then we iterate over each of the SearchHandle's results, represented by MatchDocumentSummary objects, and retrieve their URIs, which are given to the DocumentManager that reads the full documents.
Please note that the number of returned documents has been limited to 10.

The integration test (it requires a running MarkLogic server):
public class MarkLogicPersonRepositoryIntegrationTest {
private static final String NAME = "Robin van Persie";
private static final String SAMPLE_PERSON = "<person><name>" + NAME + "</name><age>29</age></person>";
private MarkLogicPersonRepository personRepository;
@Before
public void setUp() {
DatabaseClient client = DatabaseClientFactory.newClient("localhost", 8003, "rest-writer", "x", DIGEST);
personRepository = new MarkLogicPersonRepository(client.newXMLDocumentManager(), client.newQueryManager());
}
@Test
public void shouldAddAndRetrievePersonAsXmlDocument() {
// given
String personId = randomUUID().toString();
personRepository.addPerson(personId, SAMPLE_PERSON);
// when
String result = personRepository.getPerson(personId);
// then
assertThat(the(result), isEquivalentTo(the(SAMPLE_PERSON)));
}
@Test(expected = ResourceNotFoundException.class)
public void shouldRemovePerson() {
// given
String personId = randomUUID().toString();
personRepository.addPerson(personId, SAMPLE_PERSON);
// when
personRepository.removePerson(personId);
// then
personRepository.getPerson(personId);
}
@Test
public void shouldFindPersonByName() {
// given
personRepository.addPerson(randomUUID().toString(), SAMPLE_PERSON);
// when
List<String> result = personRepository.findByName(NAME);
// then
assertThat(the(result.get(0)), isEquivalentTo(the(SAMPLE_PERSON)));
}
}

In the setUp() method, DatabaseClientFactory creates a DatabaseClient based on the given credentials (they need to be the same as the ones used when setting up the database).
Once we have the client, we can create the managers needed by the implementation.

One thing to note: when the manager cannot find a document, it throws a ResourceNotFoundException.

The whole project can be found on GitHub.


Wednesday, 6 February 2013

Spring Data JPA sample project

In the previous post I showed how to set up a sample project with JPA and Hibernate.
Even though it wasn't difficult, there was a major disadvantage - we need to do a lot of coding around our DAO objects even if we want only simple operations.
Spring Data JPA helps us reduce the data access coding.

Let's start with defining dependencies:

<dependency>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-jpa</artifactId>
<version>1.1.0.RELEASE</version>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-jdbc</artifactId>
<version>3.2.0.RELEASE</version>
</dependency>
<dependency>
<groupId>org.hibernate</groupId>
<artifactId>hibernate-entitymanager</artifactId>
<version>4.1.9.Final</version>
</dependency>
<dependency>
<groupId>org.hsqldb</groupId>
<artifactId>hsqldb</artifactId>
<version>2.2.9</version>
</dependency>
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<version>4.11</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.hamcrest</groupId>
<artifactId>hamcrest-all</artifactId>
<version>1.3</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-test</artifactId>
<version>3.2.0.RELEASE</version>
<scope>test</scope>
</dependency>
Compared to the previous project there are more dependencies because of Spring. spring-test is needed to allow our test to use the Spring context. And this time we're going to use HSQLDB.

persistence.xml is much smaller because the persistence configuration will be defined in the Spring context.

<persistence xmlns="http://java.sun.com/xml/ns/persistence"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://java.sun.com/xml/ns/persistence http://java.sun.com/xml/ns/persistence/persistence_2_0.xsd" version="2.0">
<persistence-unit name="springJpaPersistenceUnit" transaction-type="RESOURCE_LOCAL">
<provider>org.hibernate.ejb.HibernatePersistence</provider>
<class>pl.mjedynak.model.Person</class>
<properties>
<property name="hibernate.hbm2ddl.auto" value="create"/>
</properties>
</persistence-unit>
</persistence>
Please note that Hibernate is still the persistence provider. The Person class is exactly the same as before.

The context is defined as:

<beans xmlns="http://www.springframework.org/schema/beans"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xmlns:context="http://www.springframework.org/schema/context"
xmlns:jpa="http://www.springframework.org/schema/data/jpa"
xsi:schemaLocation="
http://www.springframework.org/schema/beans
http://www.springframework.org/schema/beans/spring-beans-3.2.xsd
http://www.springframework.org/schema/context
http://www.springframework.org/schema/context/spring-context-3.2.xsd
http://www.springframework.org/schema/data/jpa
http://www.springframework.org/schema/data/jpa/spring-jpa.xsd">
<context:component-scan base-package="pl.mjedynak"/>
<jpa:repositories base-package="pl.mjedynak.repository"/>
<bean id="dataSource" class="org.springframework.jdbc.datasource.DriverManagerDataSource">
<property name="driverClassName" value="org.hsqldb.jdbcDriver"/>
<property name="url" value="jdbc:hsqldb:file:sampledb"/>
<property name="username" value="sa"/>
<property name="password" value=""/>
</bean>
<bean id="entityManagerFactory" class="org.springframework.orm.jpa.LocalContainerEntityManagerFactoryBean">
<property name="dataSource" ref="dataSource"/>
<property name="persistenceXmlLocation" value="persistence.xml"/>
<property name="persistenceUnitName" value="springJpaPersistenceUnit"/>
</bean>
<bean id="transactionManager" class="org.springframework.orm.jpa.JpaTransactionManager">
<property name="entityManagerFactory" ref="entityManagerFactory"/>
</bean>
</beans>
We need to specify the dataSource, entityManagerFactory and transactionManager beans.
The pl.mjedynak package will be scanned by Spring to do the autowiring.
Spring Data JPA introduces the concept of a repository, which is a higher level of abstraction than a DAO.
In our context we define a package with the repositories.

The only thing to do to be able to manage our Person class is to create an interface that extends CrudRepository - it gives us basic operations like save, find, delete, etc.


public interface PersonRepository extends CrudRepository<Person, Long> {
}
We can of course expand it with more sophisticated methods if we want.
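For example, Spring Data can derive a query from the method name alone, so a hypothetical finder could be added without writing any implementation:

public interface PersonRepository extends CrudRepository<Person, Long> {
    // derived query - Spring Data generates the "where name = ?" query from the method name
    List<Person> findByName(String name);
}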

The integration test is almost the same as before, except that it needs to use the Spring context.

@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration(locations = "classpath:spring-context.xml")
public class PersonRepositoryIntegrationTest {
@Autowired
private PersonRepository personRepository;
@Test
public void shouldFindPreviouslySavedPerson() {
// given
Integer age = 22;
String name = "Charlie";
Person person = aPerson().
withAge(age).
withName(name).build();
personRepository.save(person);
// when
List<Person> result = (List<Person>) personRepository.findAll();
// then
assertThat(result, hasSize(1));
Person foundPerson = result.get(0);
assertThat(foundPerson.getAge(), is(age));
assertThat(foundPerson.getName(), is(name));
}
}


The whole project can be found at: https://github.com/mjedynak/spring-data-jpa-example

Friday, 25 January 2013

Hibernate JPA sample project

Here's a simple example of a project that uses JPA with Hibernate as the JPA implementation.

Let's start with the required dependencies:

<dependency>
<groupId>org.hibernate</groupId>
<artifactId>hibernate-entitymanager</artifactId>
<version>4.1.9.Final</version>
</dependency>
<dependency>
<groupId>org.apache.derby</groupId>
<artifactId>derby</artifactId>
<version>10.9.1.0</version>
</dependency>
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<version>4.11</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.hamcrest</groupId>
<artifactId>hamcrest-all</artifactId>
<version>1.3</version>
<scope>test</scope>
</dependency>
We're going to use the embedded Derby database.

We will also need persistence.xml placed in the META-INF directory on our classpath:

<persistence xmlns="http://java.sun.com/xml/ns/persistence"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://java.sun.com/xml/ns/persistence http://java.sun.com/xml/ns/persistence/persistence_2_0.xsd"
version="2.0">
<persistence-unit name="test">
<provider>org.hibernate.ejb.HibernatePersistence</provider>
<class>pl.mjedynak.model.Person</class>
<properties>
<property name="javax.persistence.jdbc.driver" value="org.apache.derby.jdbc.EmbeddedDriver"/>
<property name="javax.persistence.jdbc.url" value="jdbc:derby:test;create=true"/>
<property name="javax.persistence.jdbc.user" value="root"/>
<property name="javax.persistence.jdbc.password" value="root"/>
<property name="hibernate.hbm2ddl.auto" value="create"/>
</properties>
</persistence-unit>
</persistence>
We're setting Hibernate as our persistence provider and Derby as the JDBC driver.
The entity class Person is also specified as belonging to this persistence unit.

It's defined as:

@Entity
public class Person {
@Id
@GeneratedValue
private Long id;
private String name;
private Integer age;
public Long getId() {
return id;
}
public void setId(Long id) {
this.id = id;
}
public String getName() {
return name;
}
public void setName(String name) {
this.name = name;
}
public Integer getAge() {
return age;
}
public void setAge(Integer age) {
this.age = age;
}
@Override
public int hashCode() {
return Objects.hash(id, name, age);
}
@Override
public boolean equals(Object obj) {
if (obj == null) {
return false;
}
if (getClass() != obj.getClass()) {
return false;
}
final Person other = (Person) obj;
return Objects.equals(this.id, other.id) && Objects.equals(this.name, other.name) && Objects.equals(this.age, other.age);
}
@Override
public String toString() {
return "Person{" +
"id=" + id +
", name='" + name + '\'' +
", age=" + age +
'}';
}
}

In order to manage Person we need some kind of DAO. Let's define an interface, PersonDao:

public interface PersonDao {
List<Person> findAll();
void addPerson(Person person);
}
For simplicity, it only has methods for adding a person and finding all persons.

The sample implementation could look like:

public class PersonDaoJpa implements PersonDao {
private EntityManager entityManager;
public PersonDaoJpa(EntityManager entityManager) {
this.entityManager = entityManager;
}
@Override
public List<Person> findAll() {
List<Person> result = new ArrayList<>();
EntityTransaction transaction = entityManager.getTransaction();
try {
transaction.begin();
result = entityManager.createQuery("SELECT p FROM Person p").getResultList();
transaction.commit();
} catch (Exception e) {
transaction.rollback();
}
return result;
}
@Override
public void addPerson(Person person) {
EntityTransaction transaction = entityManager.getTransaction();
try {
transaction.begin();
entityManager.persist(person);
transaction.commit();
} catch (Exception e) {
transaction.rollback();
}
}
}

And most importantly - an integration test that checks if everything is glued together correctly:

public class PersonDaoJpaIntegrationTest {
private PersonDaoJpa personDaoJpa;
private EntityManagerFactory entityManagerFactory;
private EntityManager entityManager;
@Before
public void setUp() {
entityManagerFactory = Persistence.createEntityManagerFactory("test");
entityManager = entityManagerFactory.createEntityManager();
personDaoJpa = new PersonDaoJpa(entityManager);
}
@After
public void tearDown() {
entityManager.close();
entityManagerFactory.close();
}
@Test
public void shouldFindPreviouslySavedPerson() {
// given
Integer age = 22;
String name = "Charlie";
Person person = aPerson().
withAge(age).
withName(name).build();
personDaoJpa.addPerson(person);
// when
List<Person> result = personDaoJpa.findAll();
// then
assertThat(result, hasSize(1));
Person foundPerson = result.get(0);
assertThat(foundPerson.getAge(), is(age));
assertThat(foundPerson.getName(), is(name));
}
}

The whole project can be found at:  https://github.com/mjedynak/hibernate-jpa-example

To sum up:
it's quite easy to create a JPA with Hibernate project; however, every entity class needs its own DAO (unless we use a generic one).
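
A minimal sketch of such a generic DAO, using the same transaction-per-call approach as PersonDaoJpa (error handling omitted for brevity), could look like:

public class GenericDaoJpa<T> {
    private final EntityManager entityManager;
    private final Class<T> entityClass;

    public GenericDaoJpa(EntityManager entityManager, Class<T> entityClass) {
        this.entityManager = entityManager;
        this.entityClass = entityClass;
    }

    public void add(T entity) {
        EntityTransaction transaction = entityManager.getTransaction();
        transaction.begin();
        entityManager.persist(entity);
        transaction.commit();
    }

    public List<T> findAll() {
        // the default entity name is the unqualified class name
        return entityManager.createQuery("SELECT e FROM " + entityClass.getSimpleName() + " e", entityClass)
                .getResultList();
    }
}
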
In the next post I'll take a look at the Spring Data JPA project that solves that impediment.