We will be using the Elastic Stack (Elasticsearch, Logstash and Kibana) on Mac OS X in this tutorial.
Before we proceed, we’ll need the following:
Optional:
1. Extract Filebeat on the server where your Java application server resides
2. Open up filebeat.yml
3. Under filebeat -> prospectors, add the following config:
-
  paths:
    - /var/log/yourApp/yourlog*
  type: log
  fields:
    application: your-app
4. Under the output section, enter your Logstash host. It should look something like this: hosts: ["localhost:5044"]. You can change the index name by adding index: your_index in the same Logstash section.
5. Extract Logstash to your Logstash server. This can be on the same machine as Filebeat if you like.
6. In your Logstash server, create logstash.conf in the Logstash application folder.
7. Put the following into your config file.
# The # character at the beginning of a line indicates a comment. Use
# comments to describe your configuration.
input {
  beats {
    port => 5044
    type => "log4j"
    codec => multiline {
      # Grok pattern names are valid!
      pattern => "^%{TIMESTAMP_ISO8601} "
      negate => true
      what => "previous"
    }
  }

  # Optional: add this section if you want Logstash to collect data from a table, e.g. an audit table.
  # If not, comment out or remove this section.
  jdbc {
    # Postgres JDBC connection string to our database
    jdbc_connection_string => "jdbc:postgresql://localhost:5432/yourdatabase"
    # The user we wish to execute our statement as
    jdbc_user => "your_postgres_username"
    jdbc_password => "your_postgres_password"
    # The path to our downloaded JDBC driver
    jdbc_driver_library => "postgresql-9.3-1102-jdbc41.jar"
    # The name of the driver class for PostgreSQL
    jdbc_driver_class => "org.postgresql.Driver"
    # Our query
    statement => "SELECT * from your_audit_table"
  }
}

output {
  stdout { codec => json_lines }
  elasticsearch {
    # Point to your Elasticsearch host
    hosts => ["localhost:9200"]
    index => "%{[@metadata][beat]}-%{+YYYY.MM.dd}"
    document_type => "%{[@metadata][type]}"
  }
}
8. Extract Elasticsearch and run bin/elasticsearch
9. Go to the Logstash application folder that you've extracted and run bin/logstash -f logstash.conf
10. Go to your Filebeat folder and import the template into your Elasticsearch by running curl -XPUT 'http://localhost:9200/_template/filebeat?pretty' -d@filebeat.template.json
11. Now, we run Filebeat to deliver the logs to Logstash by running sudo ./filebeat -e -c filebeat.yml -d "publish"
12. We need a front end to view the data that's been fed into Elasticsearch. Hence we'll need Kibana. Extract Kibana and edit config/kibana.yml.
13. If your Elasticsearch resides on another server, uncomment elasticsearch.url.
14. Now close the file and run Kibana with bin/kibana
15. Open up http://localhost:5601 and you'll be asked to enter your index name. Use the index name from step 4, for example yourIndex-*.
16. Once that is done, go to the Discover tab and change the time frame on the top right corner.
Your overall architecture should look something like this:
Grab the monitoring script by running git clone https://github.com/pugnusferreus/py_monitoring.git
Edit the EMAIL_LIST variable, the SERVER variable and the FROM variable. Then add a cron entry to run the script every 5 minutes:
*/5 * * * * /<your path>/monitor.py
Note: I’ve only tested this script on Python 2.6 (since my server came with that version of Python). I’m not sure if it’ll work on Python 3.
Here are some tips that I've learnt over the years to get out of the abyss.
Always communicate – Love Agile or hate it, it teaches you to constantly communicate with your peers, QA, technical lead and project manager. If you run into a roadblock, keeping the problem to yourself won't solve anything. Give yourself one day to study the problem. If you can't solve it, talk to your peers or technical lead, as they've probably solved the problem before. Communication isn't limited to verbal communication. It can be emails, code comments and even code commit messages.
Be proactive – To be proactive, we need to be lazy. Yeap! Automate your stuff. If something becomes too repetitive, automate it. Your software annoys you too much? Fix it for the better.
Respect the testers – Earlier in my career, I hated QA testers a lot. Every time they filed a bug, I felt insulted. But I would rather have the testers weed out all the bugs than release those bugs into production. So respect your testers and work with them closely. If there is a dispute over the specification, bring a third party in and have a civilized conversation.
Take pride in what you do – Sure, you're only one small individual in your company, but if you do not take pride in what you do, how would you expect the rest of the company to respect you and the project that you're working on? Taking pride in your project will make you more proactive and make you strive for improvements.
When you say it is done, please make sure that it works – Most developers claim that their task is done. When asked if they've done end to end testing, the answer is always no. When you mark your ticket as "Resolved", please make sure that it really works. Sure, we'll still find bugs, but please make an effort to test your work.
Always think ahead – The world evolves and so does our software. Always think about how you can improve your software or your software development process. Read more and learn more. As long as we live, we'll never stop learning.
Do not blame – It is easy for us to vent our frustrations on someone else. Before you do, try looking at your own code from 2 years ago and cringe.
Be a teacher – The more you teach, the faster you learn. It is also a great way to learn from your peers as well.
Here are 10 reasons why code reviews should be done in software development.
Programmers are humans, and humans make mistakes. It is good to have an extra pair of eyes to review our code.
We can learn from each other's programming chops.
Most of the time, we'll be working on different modules. By doing code reviews, knowledge gets spread across the team.
By being aware that our code will be reviewed by our peers, we'll be more conscious when we program, leading to better code quality.
Code reviews can also improve our communication skills. We'll have to be prepared to answer the questions asked by our peers.
We'll possibly find more bugs related to scenarios that we never thought of.
They are an effective training tool for junior developers.
Code reviews carry virtually no risk and are a good form of risk management.
By giving constant feedback at the source level, we will stop making the same mistakes over and over again.
They improve code readability and code documentation.
2. On your terminal, run git clone https://github.com/pugnusferreus/spring-rest.git
3. In the spring-rest folder, run mvn -e jetty:run -Djetty.port=8080
4. On your browser, go to http://localhost:8080/rest/someEndPoint and you'll get the following response: {"key":"Hello from a manager!"}
I won't explain step by step how to wire up all of this, since there are a dozen articles on how to wire up a REST application with Spring 4. Instead, I'll highlight the things that we should be looking at.
1. src/main/webapp/WEB-INF/web.xml basically tells your Java web app which Spring context it should use and the URL pattern mapping. In this example, we want to map all our REST calls to /rest/*.
2. src/main/webapp/WEB-INF/applicationContext.xml and src/main/webapp/WEB-INF/manager-context.xml are where we declare our Spring beans. I deliberately separated them so that manager classes are declared in manager-context.xml.
3. src/main/resources/log4j2.xml is where we configure our logging. We're using Log4j 2 in this example. This XML tells Log4j to log to spring-rest.log and to the terminal console.
4. pom.xml is our Maven build file.
1. com.progriff.managers.TestManager is our manager class. We should always put our business logic in a manager class.
2. com.progriff.controllers.TestController is our controller. Note that there is an @Autowired on top of the TestManager field. Spring will automatically wire up TestManager with TestController. Do note that your variable name must be the same as the bean id in the manager-context.xml file.
3. In TestController, each method that is to be mapped to a URL needs @RequestMapping and @ResponseBody. You can set the REST URL path, the request method and the media type that this REST endpoint will produce in the @RequestMapping annotation. Do note that when we return the java.util.Map from that method, Jackson will automatically convert the object into JSON.
The -s option doesn't work anymore because you might end up with stretchy videos.
A few weeks back I was tasked with making some changes so that FFmpeg will scale the video based on height. For example, if a video is 640x360, and the given height to encode is 720, the output video should be 1280x720.
After doing some research, I found a StackOverflow post with the solution. The answer was to use FFmpeg's video filter -vf scale=-1:360. That only works when the scaled width is an even number. If the scaled width is an odd number, you'll get the following error: height not divisible by 2.
The same post also states that to scale the height based on the width, you'll need to do scale=640:trunc(ow/a/2)*2 to prevent the height not divisible by 2 error. So if you need to scale by height, the command would be
-vf "scale=trunc(oh*a/2)*2:720"
Unfortunately, we're also using FFmpeg to watermark videos, and you cannot pass two -vf options to FFmpeg. Luckily, you can merge both video filters into one filter chain by separating them with a comma:
-vf "movie=watermark.png [watermark]; [in][watermark] overlay=10:main_h-overlay_h-10, scale=trunc(oh*a/2)*2:720 [out]"
Like me, if you tried to install protocol buffer 2.4.1 on OS X Mavericks, you’ll get a compilation error. This is how I fixed it.
1. Open Terminal.app and run brew update
2. Get the hash for ProtoBuf 2.4.1 by running brew versions protobuf
3. Run cd `brew --prefix`
4. Get the 2.4.1 brew installation script by running git checkout $hash_shown_in_step2 Library/Formula/protobuf.rb
5. Attempt to install protobuf 2.4.1 by running brew install protobuf and you'll get a compilation error.
6. Go to Homebrew's cache directory by running cd /Library/Caches/Homebrew
7. Unzip protobuf-2.4.1.tar.bz2
8. Open src/google/protobuf/message.h
9. Go to line 115 and apply the required change to the code there.
10. In the protobuf directory, run tar -cvjf protobuf-2.4.1.tar.bz2 .
11. Obviously, after modifying the code, the SHA sum will be different. Get the new SHA sum by running shasum protobuf-2.4.1.tar.bz2 and then run brew edit protobuf and replace the SHA1 with the new SHA sum.
12. Finally, run brew install protobuf again and enjoy your Protocol Buffer 2.4.1.
file.flush() after a file.write().
On POSIX systems, it seems that the file.read() method will automatically commit the changes to the file. On Windows systems, it'll add a null character instead of the character that you're writing into the file.
So remember kids! Do a .flush() after a .write()!
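A minimal illustration of the advice (the file name is arbitrary):

```python
import os
import tempfile

path = os.path.join(tempfile.gettempdir(), "flush_demo.txt")

f = open(path, "w")
f.write("hello")
f.flush()  # push the buffered bytes out to the OS right away

# A second handle only reliably sees the data because of the flush above.
r = open(path, "r")
content = r.read()
r.close()
f.close()
os.remove(path)

print(content)  # -> hello
```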
What you need:
1. Install Python and py2exe
2. Create a simple Python script and name it hello.py. Any trivial script will do, for example:
print "Hello World!"
3. Create a setup.py file. A minimal py2exe setup script looks like this:
from distutils.core import setup
import py2exe

setup(console=['hello.py'])
4. Open cmd.exe and go to your working directory.
5. We need to install the dependencies that are required by our Python script. Run python setup.py install
6. Then, we convert the Python script to an exe with python setup.py py2exe
7. Your hello.exe will be in the dist directory.
Here's a short tutorial on how to create an HTTP Live Stream v4 with AES encryption from a MOV file with multiple language tracks. Here is the list of software that we'll be using for this tutorial:
If you're on Windows:
HandBrakeCLI -i VIDEO_FILE_NAME.mov -o VIDEO_FILE_NAME.mp4 --preset=iPad -d slow
ffmpeg -i VIDEO_FILE_NAME.mp4 -acodec libfaac -vcodec libx264 -an -map 0 -f segment -segment_time 10 -segment_list segment.m3u8 -segment_format mpegts -vbsf h264_mp4toannexb -flags -global_header stream-%d.ts
openssl rand 16 > static.key
openssl aes-128-cbc -e -in stream-1.ts -out tsAes\stream-1.ts -p -nosalt -iv 0 -K YOUR_AES_KEY
Your encrypted TS files should be in the tsAes folder.
You'll have to know which audio track belongs to which language. The easiest way to check the language is to play the video via VLC and identify the language of each audio track. Let's assume that Stream 0:1 is English and Stream 0:2 is Chinese.
ffmpeg -i VIDEO_FILE_NAME.mov -map 0:1 -ac:a:0 2 -acodec libfaac -vn track_en.aac
ffmpeg -i VIDEO_FILE_NAME.mov -map 0:2 -ac:a:0 2 -acodec libfaac -vn track_zh.aac
ffmpeg -i track_en.aac -map 0:1 -ac:a:0 2 -acodec libfaac -vcodec copy -dcodec copy -vn -f segment -segment_time 10 -segment_list segment.m3u8 -segment_format mpegts -vbsf h264_mp4toannexb -flags -global_header stream-%d.ts
Do the same for the track_zh.aac file: ffmpeg -i track_zh.aac -map 0:2 -ac:a:0 2 -acodec libfaac -vcodec copy -vn -f segment -segment_time 10 -segment_list segment.m3u8 -segment_format mpegts -vbsf h264_mp4toannexb -flags -global_header stream-%d.ts
Edit: We'll need to do the segmenting and the audio extraction in one line to preserve the PTS values. Incorrect PTS values will result in the audio and video going out of sync.
So finally, you should have the following directory structure.
On the root directory, create a stream.m3u8 master playlist. Assuming the English and Chinese audio segment playlists live in audio_en/ and audio_zh/ and the encrypted video segments in tsAes/ (the folder names here are placeholders), it should look something like this:
#EXTM3U
#EXT-X-VERSION:4
#EXT-X-MEDIA:TYPE=AUDIO,GROUP-ID="audio",NAME="English",LANGUAGE="en",DEFAULT=YES,URI="audio_en/segment.m3u8"
#EXT-X-MEDIA:TYPE=AUDIO,GROUP-ID="audio",NAME="Chinese",LANGUAGE="zh",DEFAULT=NO,URI="audio_zh/segment.m3u8"
#EXT-X-STREAM-INF:BANDWIDTH=1280000,AUDIO="audio"
tsAes/segment.m3u8
If you don't have a GitHub account, create one!
git clone git://github.com/imathis/octopress.git octopress
cd octopress
bundle install
rake install
rake setup_github_pages and enter your repository details
git remote -v
rake generate
rake deploy
git push origin source
Here is a sample from my build.xml
<target name="run">
<java fork="true"
classname="com.progriff.jhaml.BatchJHaml"
outputproperty="javaoutput">
<classpath>
<path refid="classpath"/>
<path location="${dist}/BatchJHaml.jar"/>
</classpath>
<arg value="${haml.path}"/>
<arg value="${haml.layout.path}"/>
<arg value="${haml.output.path}"/>
<arg value="${haml.output.extension}"/>
<arg value="${haml.javascript.path}"/>
<arg value="${haml.stylesheet.path}"/>
<arg value="${haml.recursive}" />
</java>
<echo message="${javaoutput}" />
</target>
For example, if you have the following in your haml folder:
haml
|-- someDir
|   `-- baz.haml
|-- foo.haml
`-- bar.haml
The resulting jsp folder will look like this:
jsp
|-- someDir
|   `-- baz.jsp
|-- foo.jsp
`-- bar.jsp
java.lang.IllegalArgumentException: property
"javax.xml.stream.isReplacingEntityReferences" not supported
at com.caucho.xml.stream.XMLInputFactoryImpl.setProperty
(XMLInputFactoryImpl.java:265)
at com.microsoft.windowsazure.services.core.storage.utils
.Utility.createXMLStreamReaderFromStream(Utility.java:321)
Resin Web Server uses its own XMLInputFactory implementation called com.caucho.xml.stream.XMLInputFactoryImpl.
According to this article, you can override the implementation by adding a <system-property/> entry in your resin.conf.
Put the following to use the JDK's version of XMLInputFactoryImpl:
<system-property javax.xml.stream.XMLInputFactory=
"com.sun.xml.internal.stream.XMLInputFactoryImpl" />
Problem solved? Not really. Tried running the REST service again and I got the following:
com.microsoft.windowsazure.services.core.storage.StorageException:
XML specified is not syntactically valid.
After looking at the Azure SDK for Java’s source,
I’ve found out that the XMLOutputFactory and XMLStreamWriter was unable to generate the XML request body in
com.microsoft.windowsazure.services.queue.client
.QueueRequest.generateMessageRequestBody
Why? If you do a simple system out on the class name, Resin is using its own implementation for XMLOutputFactory and XMLStreamWriter. So add the following as well into your resin.conf:
<system-property javax.xml.stream.XMLOutputFactory=
"com.sun.xml.internal.stream.XMLOutputFactoryImpl" />
<system-property javax.xml.stream.XMLStreamWriter=
"com.sun.xml.internal.stream.writers.XMLStreamWriterImpl" />
tl;dr? Add the following in your resin.conf
<system-property javax.xml.stream.XMLInputFactory=
"com.sun.xml.internal.stream.XMLInputFactoryImpl" />
<system-property javax.xml.stream.XMLOutputFactory=
"com.sun.xml.internal.stream.XMLOutputFactoryImpl" />
<system-property javax.xml.stream.XMLStreamWriter=
"com.sun.xml.internal.stream.writers.XMLStreamWriterImpl" />
After that, my code works perfectly. I'm sure there are other weird problems as well; I'll keep you posted if I find any more weird stuff.
So here's the problem: after calling session.update on object A with the updated object B, the change was not persisted into the database. After hours of Googling and some help from Michael, a fellow colleague of mine, I found the problem.
According to this article, Hibernate will only issue an UPDATE statement after session.flush(). During this operation, Hibernate will compare the original object and the object to be updated. This article explains how Hibernate compares the objects.
The fix? In your Hibernate user type, override deepCopy and return another instance of the list.
@Override
public Object deepCopy(Object value) throws HibernateException
{
// if value is null, return null
if(value == null)
{
return null;
}
// copy the value into a brand new ArrayList so that Hibernate's
// dirty check compares two distinct instances
List<ObjectB> objectBList = (List<ObjectB>) value;
return new ArrayList<ObjectB>(objectBList);
}
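A standalone sketch (plain Java, no Hibernate) of why returning the same reference defeats a dirty check, while a fresh copy lets the comparison spot the change:

```java
import java.util.ArrayList;
import java.util.List;

public class DeepCopyDemo {
    public static void main(String[] args) {
        List<String> original = new ArrayList<String>();
        original.add("a");

        // Same reference: the "snapshot" mutates together with the original,
        // so comparing the two can never reveal a change.
        List<String> sameRef = original;
        original.add("b");
        System.out.println(sameRef.equals(original)); // true

        // Fresh copy: the snapshot stays put, so a later mutation is visible.
        List<String> copy = new ArrayList<String>(original);
        original.add("c");
        System.out.println(copy.equals(original)); // false
    }
}
```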
At Movideo, we've been haunted by videos that go out of sync with their audio. The symptom goes something like this:
We’ve decided to use MEncoder to normalize the original video which is in .mov format.
The command
mencoder -ovc copy -af volnorm=1 -oac libmp3lame input.mov -o output.mov
creates a MOV file which QuickTime won't recognize at all. I tried viewing it with Windows Media Player and there's audio but no video. After that, I uploaded the video via Movideo's Admin interface. The re-encoded video still goes out of sync.
The solution to this is to re-encode the audio with FFmpeg with
ffmpeg -i input.mov -vcodec copy -acodec libmp3lame output.mov
Now, output.mov can be viewed by Quick Time and works fine on Windows Media Player. Then, I tried uploading the output.mov via Movideo’s Admin interface and the video doesn’t go out of sync anymore.
Due to this limitation, I created BatchJHaml. If you're using BatchJHaml as a standalone app, it's pretty straightforward. But what if you want to include BatchJHaml in your Ant build script? Here's what you need to do.
1.) Run git clone git@github.com:pugnusferreus/batchjhaml.git in your terminal
2.) cd into the BatchJHaml directory
3.) Run ant and BatchJHaml.jar will appear in the dist directory.
4.) Copy BatchJHaml.jar into your local lib directory.
5.) Copy the following jars into your local lib directory as well. You can find them in the BatchJHaml/lib folder. If you already have them, you can omit this step. If you have later versions of these jars, you can use them as well.
commons-io-1.4.jar
commons-lang-2.5.jar
guava-r06.jar
jhaml-0.1.2.jar
markdownj-1.0.2b4-0.3.0.jar
6.) Copy the following build target into your build.xml
<target name="compile-haml">
<echo message="Converting haml files into jsp ..." />
<java fork="true" classname="com.progriff.jhaml.BatchJHaml">
<classpath>
<fileset dir="${library.home}">
<include name="**/commons-io-*.jar" />
<include name="**/commons-lang-*.jar" />
<include name="**/guava-*.jar" />
<include name="**/jhaml-*.jar" />
<include name="**/markdownj-*.jar" />
</fileset>
<path location="${library.home}/BatchJHaml.jar"/>
</classpath>
<arg value="~/YourProject/haml"/>
<arg value="~/YourProject/haml/layouts"/>
<arg value="~/YourProject/jsp"/>
<arg value="jsp"/>
<arg value="~/YourProject/javascripts"/>
<arg value="~/YourProject/stylesheets"/>
</java>
</target>
7.) In your build target, add compile-haml to your depends. Example:
<target name="build" depends="clean, prepare,compile, compile-haml">
And enjoy your Haml.
1.) Some stuff from the Sass plugin has been deprecated in Rails 3. Remove the plugin by removing its directory in vendor/plugins and add https://github.com/jasoncodes/hassle.git to your Gemfile.
2.) Move filter_parameter_logging from application_controller.rb to /config/application.rb
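In Rails 3, the equivalent setting lives in config/application.rb; a sketch (:password is just an example parameter name):

```ruby
# config/application.rb
config.filter_parameters += [:password]
```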
3.) Remove the test folder because we'll be using RSpec
4.) Add the following lines into your Gemfile. These gems should be loaded in Dev and Testing environments only.
group :development, :test do
  gem 'rspec-rails'
  gem 'spork'
  gem 'awesome_print', :require => 'ap'
  gem 'mocha'
  gem 'shoulda'
  gem 'vcr'
  gem 'webmock', :require => false
end
5.) Run bundle install and then run rails g rspec:install. This will create the spec folder and the rake file to create a test database.
6.) The structure of your spec/ directory should mirror that of app/. For example, the spec file for app/models/location.rb would be spec/models/location_spec.rb.
7.) Create your rspec test! Need example? See here.
8.) Now, in the lunchpicker directory, type bundle exec rspec spec. This will run the tests. Since we're testing the controllers and models, RSpec needs to bootstrap all the Rails stuff. The result? Slowness.
9.) To overcome this, we'll use Spork to bootstrap the Rails libraries. Open another terminal window and cd to the lunchpicker folder. Type in bundle exec spork rspec.
10.) Open another window and cd into the lunchpicker folder. Type in bundle exec rspec --drb spec. Notice that the tests run faster. If you change any Rails related config, remember to restart Spork.
What's VCR? VCR records your test suite's HTTP interactions and replays them during future test runs for fast, deterministic, accurate tests. We're interacting with Google Weather, which is an external API.
P.S. lunchpicker is now on Rails 3.
My wife had some plans for the entire afternoon, so I decided to dedicate my entire Saturday afternoon to migrating Lunchpicker from Rails 2 to Rails 3. Here's a blog post on my adventure. Thanks @jasoncodes for helping!
1. Install RVM. RVM allows you to install, manage and work with multiple Ruby environments. For example, you can have project A running on Ruby 1.8.x and project B running on Ruby 1.9.x
2. For lunchpicker, here is my .rvmrc file:
$ cat .rvmrc
rvm --create 1.9.2@lunchpicker
3. Now, we need to install Bundler. Bundler manages your application's dependencies via a Gemfile. Here's a sample of lunchpicker's Gemfile:
source 'http://rubygems.org'
gem 'rails', '3.0.9'
gem 'google_weather', :git => 'http://github.com/Ennova/google-weather.git'
gem 'informal'
gem 'haml'
gem 'httparty'
gem 'authlogic', :git => 'http://github.com/radar/authlogic.git'
gem 'pg', :require => 'pg'
gem 'sass'
gem 'dynamic_form'
4. After creating a Gemfile, run gem install bundler and then run bundle install to install all the dependencies into your RVM gemset.
5. Go into the lunchpicker directory and run rails new . (note the trailing dot). It'll obviously overwrite your old configurations with Rails 3's. Replace all the files.
6 . Do a diff and merge all your codes (controllers, models, js, css, haml etc.)
7. Remove config/initializers/new_rails_defaults.rb
8. Remove the scaffold related HTML from the public folder
9. In your helpers, mark your HTML fragments as safe. For example, you should change
return <<-HTML
<p>
blah
</p>
HTML
to
return <<-HTML.html_safe
<p>
blah
</p>
HTML
10. If you have a non-ActiveRecord model and you're using Validatable, change it to Informal. For Lunchpicker, search.rb is not an ActiveRecord model. Also, we need to change
validates_presence_of :sheltered
to
validates_inclusion_of :sheltered, :in => [true, false]
We're using an inclusion validation rather than a presence validation for booleans. This is due to the way Object#blank? handles boolean values: false.blank? # => true
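The gotcha comes from ActiveSupport's definition of blank?. Here is a minimal re-implementation (not the real ActiveSupport code, but the same logic) showing why false counts as blank:

```ruby
# ActiveSupport-style blank?: empty collections/strings are blank,
# and so is anything falsy - which catches false itself.
class Object
  def blank?
    respond_to?(:empty?) ? !!empty? : !self
  end
end

puts false.blank?  # true  -> validates_presence_of rejects false
puts true.blank?   # false
puts "".blank?     # true
```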
11. Add the following lines to config/application.rb. This will prefix the table name in front of the PK, e.g. venue_id
config.active_record.primary_key_prefix_type = :table_name_with_underscore
config.active_record.schema_format = :sql
12. default.html.haml has been changed to application.html.haml in Rails 3. Run git mv default.html.haml application.html.haml and don't forget to remove application.html.erb as well.
13. Switch rails.js from Prototype to jQuery. Remove controls.js, dragdrop.js, effect.js and prototype.js. Please see rails.js.
14 . Add rails.js in the application.html.haml for Unobtrusive Javascript support.
= javascript_include_tag 'rails'
15. If you ran rake db:migrate before step 11, run rake db:reset and then rake db:migrate again.
I hope that this will be useful to you if you need to migrate any old Rails 2 application to Rails 3.
I'll use this opportunity to add unit tests with RSpec. The Rails 3 version of lunchpicker won't be in production till the unit tests are done!
Let's say Class1 has a couple of arguments for its constructor and you do not want to provide each argument. You can simply do:
// create a mock instead of calling Class1's real constructor
Class1 class1 = Mockito.mock(Class1.class);
then, you do
// hand the mock to the class under test (a constructor argument is assumed here)
Class2 class2 = new Class2(class1);
If you want to mock a certain method call in class1, do the following:
// stub the method so the mock returns a canned value ("Hello" is just an example)
Mockito.when(class1.methodReturnsString()).thenReturn("Hello");
then, you do
// exercise class2 and assert on the stubbed value
assertEquals("Hello", class2.callClass1Method());
Assuming that class2.callClass1Method() will call class1.methodReturnsString(), the assertion will be successful.
To get a better picture of what Mockito can do for you, check out my Mockito Test project from my GitHub sample projects. I'll assume that you have Git, Java 1.6 and Ant installed on your machine.
git clone https://github.com/pugnusferreus/mockito_test
cd mockito_test
ant
You can see that all the unit tests pass.
Thanks @cstrzadala for introducing Mockito to all of us!
Sure, we could run a local DB on our machine. But what happens if it's a huge RDBMS like Oracle or DB2? Yes, the previous sentence sounds enterprisey, but as a Java developer, you can't get away from enterprisey things.
Take a look at this sample project. Assuming that your current project is using Hibernate as your ORM, you can use HSQLDB to load up an "in memory" database for your unit tests. Hibernate and HSQLDB will automatically create the tables for you.
Here’s how. I’ll assume that you have Git, Java 1.6 (or whatever it’s called now) and Ant installed on your machine.
git clone https://github.com/pugnusferreus/dao_unit_test
cd dao_unit_test
ant
You can see that all the unit tests pass.
Make sure that your Hibernate config points to the HSQLDB and that your model is mapped. Open up "com.progriff.dao.UserDaoTest" and you can see that it's your typical Java unit test.
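For reference, the relevant Hibernate settings for an in-memory HSQLDB look something like this (the property names are the standard Hibernate ones; the database name testdb is arbitrary):

```xml
<!-- hibernate.cfg.xml (fragment) -->
<property name="hibernate.connection.driver_class">org.hsqldb.jdbcDriver</property>
<property name="hibernate.connection.url">jdbc:hsqldb:mem:testdb</property>
<property name="hibernate.connection.username">sa</property>
<property name="hibernate.connection.password"></property>
<property name="hibernate.dialect">org.hibernate.dialect.HSQLDialect</property>
<!-- create the schema from the mappings on startup, drop it on shutdown -->
<property name="hibernate.hbm2ddl.auto">create-drop</property>
```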