Using a Custom Bundle for the Liferay Workspace

Technical Blogs December 6, 2016 By David H Nebinger

So the Liferay workspace is pretty handy when it comes to building all of your OSGi modules, themes, layout templates and, yes, even your legacy code from the Plugins SDK.

But when it comes to initializing a local bundle for deployment or building a dist bundle, using one of the canned Liferay 7 bundles from SourceForge may not cut it, especially if you're using Liferay DXP, if you're behind a strict proxy, or if you just want a custom bundle with fixes already streamed in.

Looking at the gradle.properties file in the workspace, it seems as though a custom URL will work and, in fact, it does.  You can point to your custom URL and download your custom bundle.

But if you don't have a local HTTP server to host your bundle, you can simulate having a custom bundle downloaded from a site without actually doing a download.

Building the Custom Bundle

So first you need to build a custom bundle.  This is actually quite easy.  It is simply a matter of downloading a current bundle, unzipping it, making whatever changes you want, then re-zipping it back up.

So you might, for example, download the DXP Service Pack 1 bundle, expand it, apply Fix Pack 8, then zip it all back up.  This will give you a custom bundle with the latest (currently) fix pack ready for your development environment.

One key here, though, is that the bundle must have a single root directory in the zip.  Basically this means you shouldn't see, for example, /tomcat-8.0.32 as a folder in the root of the zip; there should always be one directory that contains everything, such as /dxp-sp1-fp8/tomcat-8.0.32, etc.

If you try to use your custom bundle and you get weird unzip errors, like corrupt streams and such, that likely points to a missing directory at the root level with the bundle nestled underneath it.

Copy the Bundle to the Cache Folder

As it turns out, Liferay actually caches downloaded bundles in the ~/.liferay/bundles directory.

Copy your custom bundle zip to this directory to get it in the download cache.

Update the Download URL

Next you have to update your gradle.properties file to use your new URL.

You can use any URL you want, just make sure it ends with your bundle's filename:
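For example (assuming the standard Liferay workspace bundle URL property; the host and filename here are hypothetical):

```properties
# gradle.properties in the workspace root
liferay.workspace.bundle.url=http://files.example.com/bundles/dxp-sp1-fp8.zip
```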


Modify the Build Download Instructions

The final step is to modify the build.gradle file in the workspace root.  Add the following lines:

downloadBundle {
    onlyIfNewer false
    overwrite false
}

This basically has the downloadBundle task skip the network check to see whether the remote URL file exists and is newer than the locally cached file.


Hey, that's it!  You're now ready to go with your local bundle.  Issue the "gradlew distBundleZip" command to build all of your modules, extract your custom bundle, feather in your deployment artifacts, and zip it all up into a ready-to-deploy Liferay bundle.

Great for parties, entertaining, and even building a container-based deployment artifact for a cloud-based Liferay environment.


Debugging Liferay 7 in IntelliJ

Technical Blogs September 15, 2016 By David H Nebinger


So I'm doing more and more development using pure IntelliJ for Liferay 7 / DXP, even debugging.

I thought I'd share how I do it in case someone else is looking for a brief how-to.

Tomcat Setup

So I do not like running Tomcat within the IDE; it just feels wrong.  I'd rather have it run as a separate JVM with its own memory settings, etc.  Besides, I use external Tomcats to test deployments, run demos, etc., so I use them for development also.  The downside to this approach is that there is zero support for hot deploy; if you change code, you have to do a build and deploy for debugging to work.

Configuring Tomcat for debugging is really easy, but a quick script copy will make it even easier.

In your tomcat-8.0.32/bin directory, copy the startup script as a debug script.  On Windows that means copying startup.bat as debug.bat; on all others you copy startup.sh as debug.sh.

Edit your new debug script with your favorite text editor.  Practically at the last line of the file you'll find the EXECUTABLE start line (it will vary based upon your platform).

Right before the "start" argument, insert the word "jpda" and save the file.

For Windows your line will read:

call "%EXECUTABLE%" jpda start %CMD_LINE_ARGS%

For all others your line will read:

exec "$PRGDIR"/"$EXECUTABLE" jpda start "$@"

This gives you a great little startup script that enables remote debugging.

Use your debug script to launch Tomcat.
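The "jpda" argument makes catalina read a couple of environment variables for the debug socket; if port 8000 is already taken, you can override them before launching (the names and defaults below are Tomcat's own, shown for 8.0):

```shell
# catalina.sh/catalina.bat read these when started with "jpda";
# the values below are the Tomcat 8.0 defaults.
export JPDA_TRANSPORT=dt_socket
export JPDA_ADDRESS=8000
```

Then launch with your debug script as usual.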

IntelliJ Setup

We now need to set up a remote debugging configuration.  From the Run menu, choose Edit Configurations... option to open the Run/Debug Configurations dialog.

Click the + sign in the upper left corner to add a new configuration and choose Remote from the dropdown menu.

Add Configuration Dialog

Give the configuration a valid name and change the port number to 8000 (Tomcat defaults to 8000).  If debugging locally, keep localhost as the host; if debugging on a remote server, use the remote hostname.

Remote Tomcat Debugging Setup

Click on OK to save the new configuration.

To start a debug session, select Debug from the Run menu and select the debug configuration to launch.  The debug panel will be opened and the console should report it has connected to Tomcat.

Debugging Projects

If your project happens to be the matching Liferay source for the portal you're running, you should have all of the source available to start an actual debug session.  I'm usually in my own projects needing to understand what is going on in my code or the interaction of my code with the portal.

So when I'm ready to debug I have already built and deployed my module and I can set breakpoints in my code and start clicking around in the portal.

When one of your breakpoints is hit, IntelliJ will come to the front with the debugger front and center.  You can step through your code, set watches, and view object, variable and parameter values.

IntelliJ will let you debug your code or any declared dependencies in your project.  But once you make a call into code that is not a dependency of your project, the debugger may lose track of where you actually are.

Fortunately there's an easy fix for this.  Choose Project Structure... from the File menu.  Select the Libraries item on the left.  The right side is where additional libraries can be added for debugging without affecting your actual project dependencies.

Click the + sign to add a new library.  Pick Java if you have a local source directory and/or jar file that you want to add or Maven if you want to download the dependency from the Maven repositories.  So, for example, you may want to add the portal-impl.jar file and link in the source directory to help debug against the core.  For the OSGi modules, you can add the individual jars or source dirs as you need them.
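For example, if you choose Maven, the portal core implementation is published under these coordinates (the version placeholder must be replaced with the artifact version matching your portal release):

```
com.liferay.portal:com.liferay.portal.impl:<version>
```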

Add Libraries Dialog


So now you can debug Liferay/Tomcat remotely in IntelliJ.

Perhaps in a future blog I'll throw together a post about debugging Tomcat within IntelliJ instead of remotely...

Liferay 7 Development, Part 6

Technical Blogs September 14, 2016 By David H Nebinger


In part 5 we started the portlet code.  We added the configuration support, started the portlet and added the PanelApp implementation to get the portlet in the control panel.

In this part we will add the view layer to the portlet along with the action handlers.  To complete these pieces we'll be layering in the use of our Filesystem Access DS component.

We're also going to take a quick look at Lexicon and what it means for us average portlet developers and implement our portlet using the new Liferay MVC framework.

MVC Implementation Details

The new Liferay MVC takes on a lot of the grunt work for portlet implementations, and in this new iteration we're actually building OSGi components that leverage annotations to get everything done.

So let's take a look at one of the ActionCommand components to see how they work.  The TouchFileFolderMVCActionCommand is one of the simpler action commands in our portlet so this one will allow us to look at the aspects of ActionCommands without getting bogged down in the implementation code.

/**
 * class TouchFileFolderMVCActionCommand: Action command that handles the 'touch' of the file/folder.
 * @author dnebinger
 */
@Component(
	configurationPid = "com.liferay.filesystemaccess.portlet.config.FilesystemAccessPortletInstanceConfiguration",
	immediate = true,
	property = {
		"javax.portlet.name=" + FilesystemAccessPortletKeys.FILESYSTEM_ACCESS,
		"mvc.command.name=/touch_file_folder"
	},
	service = MVCActionCommand.class
)
public class TouchFileFolderMVCActionCommand extends BaseMVCActionCommand {

So here is a standard ActionCommand declaration.  The annotation identifies our configuration pid for accessing portlet instance config, the properties indicate this is an ActionCommand for our filesystem access portlet and the MVC command path to get to this action class, and the service declaration indicates this is an MVCActionCommand implementation.

The class itself extends the BaseMVCActionCommand so we don't have to implement all of the plumbing, just our necessary business logic.

	/**
	 * doProcessAction: Called to handle the touch file/folder action.
	 * @param actionRequest Request instance.
	 * @param actionResponse Response instance.
	 * @throws Exception in case of error.
	 */
	@Override
	protected void doProcessAction(
			ActionRequest actionRequest, ActionResponse actionResponse)
		throws Exception {

And this is the method declaration that needs to be implemented as an extension of BaseMVCActionCommand.

		// get the config instance
		FilesystemAccessPortletInstanceConfiguration config = getConfiguration(
			actionRequest);

		if (config == null) {
			logger.warn("No config found.");

			SessionErrors.add(
				actionRequest, MissingConfigurationException.class.getName());

			return;
		}

		// Extract the target and current path from the action request params.
		String touchName = ParamUtil.getString(actionRequest, Constants.PARM_TARGET);
		String currentPath = ParamUtil.getString(actionRequest, Constants.PARM_CURRENT_PATH);

		// get the real target path to use for the service call
		String target = getLocalTargetPath(currentPath, touchName);

		// use the service to touch the item.
		_filesystemAccessService.touchFilesystemItem(config.rootPath(), target);
	}

This next method demonstrates how an action command can get access to the portlet instance configuration.

	/**
	 * getConfiguration: Returns the configuration instance given the action request.
	 * @param request The request to get the config object from.
	 * @return FilesystemAccessPortletInstanceConfiguration The config instance.
	 */
	private FilesystemAccessPortletInstanceConfiguration getConfiguration(
		ActionRequest request) {

		// Get the theme display object from the request attributes
		ThemeDisplay themeDisplay = (ThemeDisplay)request.getAttribute(
			WebKeys.THEME_DISPLAY);

		// get the current portlet instance
		PortletInstance instance = PortletInstance.fromPortletInstanceKey(
			themeDisplay.getPortletDisplay().getId());

		FilesystemAccessPortletInstanceConfiguration config = null;

		// use the configuration provider to get the configuration instance
		try {
			config = _configurationProvider.getPortletInstanceConfiguration(
				FilesystemAccessPortletInstanceConfiguration.class,
				themeDisplay.getLayout(), instance);
		} catch (ConfigurationException e) {
			logger.error("Error getting instance config.", e);
		}

		return config;
	}

	/**
	 * getLocalTargetPath: Returns the local target path.
	 * @param localPath The local path.
	 * @param target The target filename.
	 * @return String The local target path.
	 */
	private String getLocalTargetPath(final String localPath, final String target) {
		if (Validator.isNull(target)) {
			return null;
		}

		if (Validator.isNull(localPath)) {
			return StringPool.SLASH + target;
		}

		if (localPath.trim().endsWith(StringPool.SLASH)) {
			return localPath.trim() + target.trim();
		}

		return localPath.trim() + StringPool.SLASH + target.trim();
	}

Just as with our previous OSGi components, we rely on OSGi to inject services needed to implement the component functionality.

	/**
	 * setConfigurationProvider: Sets the configuration provider for config access.
	 * @param configurationProvider The config provider to use.
	 */
	@Reference
	protected void setConfigurationProvider(
		ConfigurationProvider configurationProvider) {

		_configurationProvider = configurationProvider;
	}

	/**
	 * setFilesystemAccessService: Sets the filesystem access service instance to use.
	 * @param filesystemAccessService The filesystem access service instance.
	 */
	@Reference(unbind = "-")
	protected void setFilesystemAccessService(
		final FilesystemAccessService filesystemAccessService) {

		_filesystemAccessService = filesystemAccessService;
	}

	private FilesystemAccessService _filesystemAccessService;

	private ConfigurationProvider _configurationProvider;

	private static final Log logger = LogFactoryUtil.getLog(
		TouchFileFolderMVCActionCommand.class);

}

As I started fleshing out the other ActionCommand implementations, I quickly found that I was copying and pasting the getConfiguration() and getLocalTargetPath() methods.  I refactored those into a base class, BaseActionCommand, and changed all of the ActionCommand implementations to extend this base class, so don't be alarmed when the source differs from the listing above.

Serving resources is handled in a similar fashion.  Below is the declaration for the FileDownloadMVCResourceCommand, the component which will handle serving the file as a Serve Resource handler.

/**
 * class FileDownloadMVCResourceCommand: A resource command class for returning files.
 * @author dnebinger
 */
@Component(
	immediate = true,
	property = {
		"javax.portlet.name=" + FilesystemAccessPortletKeys.FILESYSTEM_ACCESS,
		"mvc.command.name=/file_download"
	},
	service = MVCResourceCommand.class
)
public class FileDownloadMVCResourceCommand extends BaseMVCResourceCommand {

As with all Liferay MVC implementations, the view (render phase) is handled with JSP files.  JSP files do not have access to the OSGi injection mechanisms, so we have to use a different mechanism to get the OSGi injected resources and make them available to the JSP files.  We change the portlet class to handle this injection and pass through:

/**
 * class FilesystemAccessPortlet: This portlet is used to provide filesystem
 * access.  Allows an administrator to grant access to users to access local
 * filesystem resources, useful in those cases where the user does not have
 * direct OS access.
 * This portlet will provide access to download, upload, view, 'touch' and
 * edit files.
 * @author dnebinger
 */
@Component(
	immediate = true,
	property = {
		"javax.portlet.name=" + FilesystemAccessPortletKeys.FILESYSTEM_ACCESS,
		"javax.portlet.display-name=Filesystem Access"
	},
	service = Portlet.class
)
public class FilesystemAccessPortlet extends MVCPortlet {

	/**
	 * render: Overrides the parent method to handle the injection of our
	 * service as a render request attribute so it is available to all of the
	 * jsp files.
	 * @param renderRequest The render request.
	 * @param renderResponse The render response.
	 * @throws IOException In case of error.
	 * @throws PortletException In case of error.
	 */
	@Override
	public void render(RenderRequest renderRequest, RenderResponse renderResponse)
		throws IOException, PortletException {

		// set the service as a render request attribute
		renderRequest.setAttribute(
			Constants.ATTRIB_FILESYSTEM_SERVICE, _filesystemAccessService);

		// invoke super class method to let the normal render operation run.
		super.render(renderRequest, renderResponse);
	}

	/**
	 * setFilesystemAccessService: Sets the filesystem access service instance to use.
	 * @param filesystemAccessService The filesystem access service instance.
	 */
	@Reference(unbind = "-")
	protected void setFilesystemAccessService(
		final FilesystemAccessService filesystemAccessService) {

		_filesystemAccessService = filesystemAccessService;
	}

	private FilesystemAccessService _filesystemAccessService;

	private static final Log logger = LogFactoryUtil.getLog(
		FilesystemAccessPortlet.class);

}

So here we let OSGi inject the references into the portlet instance class itself, and we override the render() method to pass our service references to the view layer as render request attributes.  In our init.jsp page, you'll find that the service reference instances are extracted from the render request attributes and turned into a variable that will be available to all JSP pages that include the init.jsp file.  In this way our JSPs have access to the injected services without having to go through the older Util classes to statically access the service reference.

So the only remarkable thing about the JSP files themselves is their location.  Instead of the old way of having the JSPs next to the WEB-INF folder the way we used to build and deploy portlets, they are now built and shipped within the module jar by putting them into the resources/META-INF/resources directory; this is our new "root" path for all web assets.  So in this folder in our project we have all of our JSP files as well as a css folder with our css file.
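As a sketch, the relevant layout inside the web module looks something like this (in a Gradle module; file names beyond those discussed in this series are illustrative):

```
src/main/resources/META-INF/resources/
  css/main.css
  init.jsp
  configuration.jsp
  view.jsp
```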

Unfortunately I implemented the JSP files before Nate Cavanaugh's new blog entry came out.  Had I waited, I might have known that my use of AUI may have been a bad decision.  But then I remembered that the bulk of the portal is written using AUI and that, unless the UI is completely rewritten all the way down to the least significant portlet, AUI is going to remain for some time to come.

Oh yeah, Lexicon

I mentioned that I was going to talk about Lexicon in the portlet.  Lexicon (and Metal and Soy and ...) are really hot topics for pure front end developers.  I'm looking for more of the discussions for cross-functional developers, the average developers that are doing front-end and back-end and are looking for a suitable path to navigate between both worlds without falling down the rabbit holes both sides offer from time to time.  AUI has historically been that path, a tag library that cross-functional developers could use to leverage Liferay's use of YUI javascript without really having to learn all of the details in using the AUI/YUI javascript library.

So to start that conversation, I'm going to talk about one of the design choices I made and how Lexicon was necessary as a result.

My need was fairly simple - I wanted to show an "add file" button that, when clicked, would show a dialog to collect the filename.  The dialog would have an OK button (that would trigger the creation of the file) and a cancel button (that cancelled the add of the file).

So I needed a modal dialog, but how was I going to implement that?  The choices for modal dialog implementation, as we all know, are seemingly endless, but does Lexicon offer a solution?

The answer is yes, there is a Lexicon solution; the Lexicon documentation covers the modal component.

Now if you're like me, you read that whole page and can see how the front-end guys are just eating this up.  Use some standard tags, decorate with some attributes and voila, you have yourself a modal dialog on a page.  But you're left wondering how you're going to drop this into your portlet JSP page, how you're going to wire it up to trigger calls to your backend, etc., and you think back wistfully, remembering the AUI tag library and how you could use a few JSP tags to get a similar outcome...  Ah, the good ole days...

Oh, sorry, back on topic.  So I was able to leverage the Lexicon modal dialog in my JSP page and, with some javascript help, got everything working the way I needed.  What I'm about to show is likely not the best way to have integrated Lexicon; I'm sure Nate and his team would be able to shoot holes all through this code.  But, like I said, I'm looking for the discussion that covers how cross-functional developers will use Lexicon, and if this starts that discussion, then it's worth sharing.

So here goes; these are the parts of view.jsp that handle the add file modal dialog:

<button class="btn btn-default" data-target="#<portlet:namespace/>AddFileModal" data-toggle="modal" id="<portlet:namespace/>showAddFileBtn"><liferay-ui:message key="add-file" /></button>
<div aria-labelledby="<portlet:namespace/>AddFileModalLabel" class="fade in lex-modal modal" id="<portlet:namespace/>AddFileModal" role="dialog" tabindex="-1">
	<div class="modal-dialog modal-lg">
		<div class="modal-content">
			<portlet:actionURL name="/add_file_folder" var="addFileActionURL" />

			<div id="<portlet:namespace/>fm3">
				<form action="<%= addFileActionURL %>" id="<portlet:namespace/>form3" method="post" name="<portlet:namespace/>form3">
					<aui:input name="<%= ActionRequest.ACTION_NAME %>" type="hidden" />
					<aui:input name="redirect" type="hidden" value="<%= currentURL %>" />
					<aui:input name="currentPath" type="hidden" value="<%= currentPath %>" />
					<aui:input name="addType" type="hidden" value="addFile" />

					<div class="modal-header">
						<button aria-labelledby="Close" class="btn btn-default close" data-dismiss="modal" role="button" type="button">
							<svg aria-hidden="true" class="lexicon-icon lexicon-icon-times">
								<use xlink:href="<%= themeDisplay.getPathThemeImages() + "/lexicon/icons.svg" %>#times" />

						<button class="btn btn-default modal-primary-action-button visible-xs" type="button">
							<svg aria-hidden="true" class="lexicon-icon lexicon-icon-check">
								<use xlink:href="<%= themeDisplay.getPathThemeImages() + "/lexicon/icons.svg" %>#check" />

						<h4 class="modal-title" id="<portlet:namespace/>AddFileModalLabel"><liferay-ui:message key="add-file" /></h4>

					<div class="modal-body">
							<aui:input autoFocus="true" helpMessage="add-file-help" id="addNameFile" label="add-file" name="addName" type="text" />

					<div class="modal-footer">
						<button class="btn btn-default close-modal" id="<portlet:namespace/>addFileBtn" name="<portlet:namespace/>addFileBtn" type="button"><liferay-ui:message key="add-file" /></button>
						<button class="btn btn-link close-modal" data-dismiss="modal" type="button"><liferay-ui:message key="cancel" /></button>

So first I have the button that will trigger the display of the modal dialog.  The modal dialog is in the <div /> that follows.  The content div contains my AUI-based form but is decorated with appropriate Lexicon tags to add the dialog buttons.

There's also some javascript on the page that affects the dialog:

<aui:script use="aui-base">
	var showAddFile = A.one('#<portlet:namespace/>showAddFileBtn');

	if (showAddFile) {
		showAddFile.after('click', function() {
			var addNameText = A.one('#<portlet:namespace/>addNameFile');

			if (addNameText) {
				addNameText.focus();
			}
		});
	}

	var addFileBtnVar = A.one('#<portlet:namespace/>addFileBtn');

	if (addFileBtnVar) {
		addFileBtnVar.after('click', function() {
			var fm = A.one('#<portlet:namespace/>form3');

			if (fm) {
				fm.submit();
			}
		});
	}
</aui:script>
The first chunk is used to set focus on the name field after the dialog is displayed.  The second chunk triggers the submit of the form when the user clicks the okay button in the dialog.

The highlight of this code is that we get a modal dialog without really having to code much javascript, JSP tags, etc.  We basically only had to add the code necessary to flesh out the dialog content.

And I think that's really the essence of the Lexicon stuff; I think it's all going to work out to be a "standard" set of tags and attribute decorations that will render the UI, plus we'll have some code to put in at the JSP level to bind into our regular portlet code.


So here we are at the end of part 6 and I think I've covered it all...

We now have a finished project that satisfies all of our original requirements:

  • Must run on Liferay 7 CE GA2 (or newer).  Since we leveraged all Liferay 7 tools and are building module jars, we're definitely Liferay 7 compatible.
  • Must use Gradle for the build tool.  We set this up in part 2 of the blog series using the blade command line tool.
  • Must be full-on OSGi modules, no legacy stuff.  We also started this in part 2 and continued the module development through all other parts.
  • Must leverage Lexicon.  This was done in our modal dialog just introduced above.
  • Must leverage the new Configuration facilities.  The configuration facilities were added in part 5 as part of the initial portlet setup.
  • Must leverage the new Liferay MVC portlet framework.  The bulk of the Liferay MVC implementation was added in this blog part.
  • Must provide new Panel Application (side bar) support.  This was covered in part 5 of the blog series.

The original requirements have been satisfied, but how does it look?  Here are some screen shots to whet your appetite:

Main View

The main view lists files/folders that can be acted upon.  The path shows that I'm in /tomcat-8.0.32, but actually I'm off somewhere else in the filesystem.  Remember in a previous part I was using a "root path" to constrain filesystem access?  This shows that I am within a view sandbox that I cannot just sneak my way out of.  Even though we're exposing the underlying filesystem, we don't want to throw out all semblance of security.

Add File Dialog

Our Lexicon-based modal dialog for adding a new file.

Upload File Dialog

The modal dialog even works for a file upload.

View File

The file view component is provided by the AUI ACE editor component.

Edit File

The edit file component is also provided by the ACE editor.

While not shown, the configuration panel allows the admin to set the "root path", enable/disable adds, uploads, downloads, deletes and edits.

That's pretty much it.  Hope you enjoyed the multi-part blog.  Feel free to comment below or, better yet, launch a discussion in the forums.

And remember, you can find the source code on GitHub.

Liferay 7 Development, Part 5

Technical Blogs September 14, 2016 By David H Nebinger


In the first four parts we introduced our project, laid out the Liferay workspace to create our modules, defined our DS service API and completed our DS service implementation.

It's now time to move on to starting our Filesystem Access Portlet.  With everything I want to do in this portlet, it's going to span multiple parts.  In this part we're going to start the portlet by tackling some key parts.

Note that this portlet is going to be a standard OSGi module, so we're not building a portlet war file or anything like that.  We're building an OSGi portlet module jar.


Configuration

So configuration is probably an odd place to start, but it is key to portlet design.  This basically defines the configurable parts of our portlet: the fields we'll allow an administrator to set, which then drive the rest of the portlet.  Personally I've always found it easier to build in the necessary flexibility up front rather than getting well down the road in portlet development and trying to retrofit it in later on.

For example, one of our configurable items is what I'm calling the "root path".  The root path is a fixed filesystem path that constrains what the users of the portlet can access.  This constraint is enforced at all levels; it forms a layer of protection to ensure folks are not creating/editing files outside of the root path.  By starting with this as a configuration point, the rest of the development has to take it into account.  And we've seen this already in the DS API and service presented in the previous parts - every method in the API has the rootPath as the first argument (yes, I had my configuration figured out before I started any of the development).
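To make that concrete, here is a small standalone sketch of the kind of containment check this implies (a hypothetical helper, not the actual FilesystemAccessService code): resolve the requested path against the root, normalize it, and verify it is still under the root.

```java
import java.nio.file.Path;
import java.nio.file.Paths;

public class RootPathGuard {

	// Returns true only if the target, resolved against the root and
	// normalized (so "../" tricks are collapsed), is still under the root.
	public static boolean isInsideRoot(String rootPath, String relativeTarget) {
		Path root = Paths.get(rootPath).toAbsolutePath().normalize();

		Path target = root.resolve(
			relativeTarget.startsWith("/")
				? relativeTarget.substring(1) : relativeTarget
		).normalize();

		return target.startsWith(root);
	}
}
```

With this shape of check, a request for "../../etc/passwd" normalizes to a path outside the root and is rejected, while "/docs/readme.txt" stays inside the sandbox.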

So let's review our configuration elements:

  • rootPath: The root path that constrains all filesystem access.
  • showPermissions: A flag determining whether to show permissions.  On Windows systems permissions don't really work, so this flag can remove the non-functional permissions column.
  • deletesAllowed: A flag that determines whether files/folders can be deleted.
  • uploadsAllowed: A flag that determines whether file uploads are allowed.
  • downloadsAllowed: A flag that determines whether file downloads are allowed.
  • editsAllowed: A flag that determines whether inline editing is allowed.
  • addsAllowed: A flag that determines whether file/folder additions are allowed.
  • viewSizeLimit: A size limit that determines whether a file can be viewed in the browser.  This imposes an upper limit on generated HTML fragment size.
  • downloadableFolderSizeLimit: The size limit for downloading folders.  Since folders are zipped live out of the filesystem, this can be used to ensure server resources are not overwhelmed creating a large zip stream in memory.
  • downloadableFolderItemLimit: The file count limit for downloadable folders.  This too is a mechanism to define an upper limit on server resource consumption.

Seeing this list and understanding how it will affect the interface, it should be pretty clear it's going to be much easier building that into the UI from the start rather than trying to retrofit it in later.

In previous versions of Liferay we would likely be using portlet preferences for these options, but since we're building for Liferay 7 we're going to take advantage of the new Configuration support.

We're going to start by creating a new package in our portlet, com.liferay.filesystemaccess.portlet.config (current Liferay practice is to put the configuration classes into a config package in your portlet project).

There are a bunch of classes that will be used for configuration, let's start with the central one, the configuration definition class FilesystemAccessPortletInstanceConfiguration:

/**
 * class FilesystemAccessPortletInstanceConfiguration: Instance configuration for
 * the portlet configuration.
 * @author dnebinger
 */
@ExtendedObjectClassDefinition(
	category = "platform",
	scope = ExtendedObjectClassDefinition.Scope.PORTLET_INSTANCE
)
@Meta.OCD(
	localization = "content/Language",
	name = "",
	id = "com.liferay.filesystemaccess.portlet.config.FilesystemAccessPortletInstanceConfiguration"
)
public interface FilesystemAccessPortletInstanceConfiguration {

	/**
	 * rootPath: This is the root path that constrains all filesystem access.
	 */
	@Meta.AD(deflt = "${LIFERAY_HOME}", required = false)
	public String rootPath();

	// snip

}

There's a lot of stuff here, so let's dig in...

The @Meta annotations are from BND and define meta info on the class and the members.  The OCD annotation on the class defines the name of the configuration (using the portlet language bundle) and the ID for the configuration.  The ID is critical and is referenced elsewhere and must be unique across the portal, so the full class name is the current standard.  The AD annotation is used to define information about the individual fields.  We're defining the default values for the parameters and indicating that they are not required (since we have a default).

The @ExtendedObjectClassDefinition is used to define the section of the System Settings configuration panel.  The category (language bundle key) defines the major category the settings will be set from, and the scope defines whether the config is per portlet instance, per group, per company or system-wide.  We're going to use portlet instance scope so different instances can have their own configuration.

The next class is the FilesystemAccessPortletInstanceConfigurationAction class, the class that handles submits when the configuration is changed.  Instead of showing the whole class, I'm only going to show parts of the file that need some discussion.  The whole class is in the project in Github.

/**
 * class FilesystemAccessPortletInstanceConfigurationAction: Configuration action
 * for the filesystem access portlet.
 * @author dnebinger
 */
@Component(
	immediate = true,
	property = {
		"javax.portlet.name=" + FilesystemAccessPortletKeys.FILESYSTEM_ACCESS
	},
	service = ConfigurationAction.class
)
public class FilesystemAccessPortletInstanceConfigurationAction
	extends DefaultConfigurationAction {

	/**
	 * getJspPath: Return the path to our configuration jsp file.
	 * @param request The servlet request.
	 * @return String The path.
	 */
	@Override
	public String getJspPath(HttpServletRequest request) {
		return "/configuration.jsp";
	}

	/**
	 * processAction: This is used to process the configuration form submission.
	 * @param portletConfig The portlet configuration.
	 * @param actionRequest The action request.
	 * @param actionResponse The action response.
	 * @throws Exception in case of error.
	 */
	@Override
	public void processAction(
			PortletConfig portletConfig, ActionRequest actionRequest,
			ActionResponse actionResponse)
		throws Exception {

		// snip
	}

	 * setServletContext: Sets the servlet context, use your portlet's bnd.bnd Bundle-SymbolicName value.
	 * @param servletContext The servlet context to use.
		target = "(osgi.web.symbolicname=com.liferay.filesystemaccess.web)", unbind = "-"
	public void setServletContext(ServletContext servletContext) {

So the configuration action handler is actually a DS service.  It's using the @Component annotation and provides the ConfigurationAction service.  The javax.portlet.name property value is the portlet name (so the portal maps the correct configuration action handler to the portlet).

The class returns its own path to the JSP file used to show the configuration options.  The path returned is relative to the portlet's web root.

The processAction() method is used to process the values from the configuration form submit.  When you review the code you'll see it is extracting parameter values and saving preference values.

The class uses an OSGi injection using @Reference to inject the servlet context for the portlet.  The important part to note here is that the value must match the Bundle-SymbolicName value from the project's bnd.bnd file.

There are three other source files in this package that I'll describe briefly...

The FilesystemAccessDisplayContext class is a wrapper class to provide access to the configuration instance object in different portlet phases (i.e. Action or Render phases).  In some phases the regular PortletDisplay instance (an object available from the ThemeDisplay) can be used to get the instance config object, but in the Action phase the ThemeDisplay is not fully populated so this access fails.  The FilesystemAccessDisplayContext class provides access in all phases.

The FilesystemAccessPortletInstanceConfigurationBeanDeclaration class is a simple component to return the FilesystemAccessPortletInstanceConfiguration class so a configuration instance can be created on demand for new instances.

The FilesystemAccessPortletInstanceConfigurationPidMapping class maps the configuration class (FilesystemAccessPortletInstanceConfiguration) with the portlet id to again support dynamic creation and tracking of configuration instances.
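Since both of these components are tiny, a sketch of the bean declaration shows the pattern; this assumes Liferay's ConfigurationBeanDeclaration interface from com.liferay.portal.kernel.settings.definition, and the actual class is in the project source:

```java
import com.liferay.portal.kernel.settings.definition.ConfigurationBeanDeclaration;

import org.osgi.service.component.annotations.Component;

// Registers the configuration interface so the portal can create
// configuration instances on demand.
@Component
public class FilesystemAccessPortletInstanceConfigurationBeanDeclaration
	implements ConfigurationBeanDeclaration {

	@Override
	public Class<?> getConfigurationBeanClass() {
		return FilesystemAccessPortletInstanceConfiguration.class;
	}
}
```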

The Portlet Class

Portlet classes are much smaller than what they used to be under Liferay MVC.  Here is the complete portlet class:

/**
 * class FilesystemAccessPortlet: This portlet is used to provide filesystem
 * access.  Allows an administrator to grant access to users to access local
 * filesystem resources, useful in those cases where the user does not have
 * direct OS access.
 *
 * This portlet will provide access to download, upload, view, 'touch' and
 * edit files.
 * @author dnebinger
 */
@Component(
	immediate = true,
	property = {
		"javax.portlet.display-name=Filesystem Access",
		// the remaining javax.portlet.* and com.liferay.portlet.*
		// properties from the project source are omitted here
	},
	service = Portlet.class
)
public class FilesystemAccessPortlet extends MVCPortlet {
}

It has no body at all.  Can't get any simpler than that...

There is no longer a portlet.xml file or a liferay-portlet.xml file.  Instead, these values are all provided through the properties on the DS component for the portlet.

The Panel App

Our portlet is a regular portlet that admins will be able to drop on any page they want.  However, we're also going to install the portlet as a Panel App, the new way to create a control panel entry.  We'll do this using the FilesystemAccessPanelApp class:

/**
 * class FilesystemAccessPanelApp: Component which exposes our portlet as a control panel app.
 * @author dnebinger
 */
@Component(
	immediate = true,
	property = {
		"panel.category.key=" + PanelCategoryKeys.CONTROL_PANEL_CONFIGURATION
	},
	service = PanelApp.class
)
public class FilesystemAccessPanelApp extends BasePanelApp {

	/**
	 * getPortletId: Returns the portlet id that will be in the control panel.
	 * @return String The portlet id.
	 */
	@Override
	public String getPortletId() {
		return FilesystemAccessPortletKeys.FILESYSTEM_ACCESS;
	}

	/**
	 * setPortlet: Injects the portlet into the base class, uses the actual portlet name for the lookup which
	 * also matches the value set in the portlet class annotation properties.
	 * @param portlet The portlet to inject.
	 */
	@Override
	@Reference(
		target = "(javax.portlet.name=" + FilesystemAccessPortletKeys.FILESYSTEM_ACCESS + ")",
		unbind = "-"
	)
	public void setPortlet(Portlet portlet) {
		super.setPortlet(portlet);
	}
}

The @Component annotation shows this is yet another DS service, one that provides the PanelApp service.  The panel.category.key value will put our portlet under the configuration section of the control panel, and the high panel app order value will put our portlet near the bottom of the list.

The methods specified will ensure the base class has the Filesystem Access portlet references for the panel app to work.

The JSPs

We will update the init.jsp and add the configuration.jsp files.  Not much to see, pretty generic jsp implementations.  The init.jsp file pulls in all of the includes used in the other jsp files and copies the config into local member fields.  The configuration jsp file has the AUI form for all of the configuration elements.

Configure The Module

Our portlet is still in the process of being fleshed out, but we'll wrap up this part of the blog by configuring and deploying the module in its current form.

Edit the bundle file so it contains the following:

Bundle-Name: Filesystem Access Portlet
Bundle-SymbolicName: com.liferay.filesystemaccess.web
Bundle-Version: 1.0.0
Import-Package: \
	javax.portlet,\
	javax.servlet,\
	*
Private-Package: com.liferay.filesystemaccess.portlet
Web-ContextPath: /filesystem-access
-metatype: *

So here we're forcing the import of the portlet and servlet APIs; this ensures that dependencies of our dependencies are included.

We also declare that our portlet classes are all private.  This means that other folks will not be able to use us as a dependency and include our classes.

An important addition is the Web-ContextPath key.  This value is used while the portlet is running to define the context path to portlet resources.  Given the value above, the portal will make our resources available as /o/filesystem-access/..., so for example you could go to /o/filesystem-access/css/main.css to pull the static css file if necessary.


Well the modifications are all done.  At a command prompt, go to the modules/apps/filesystem-access-web directory and execute the following command:

$ ../../../gradlew build

You should end up with your new com.liferay.filesystem.access.web-1.0.0.jar bundle file in the build/libs directory.  If you have a Liferay 7 CE or Liferay DXP tomcat environment running, you can drop the jar into the Liferay deploy folder.

Drop into the Gogo Shell and you can even verify that the module has started:

Welcome to Apache Felix Gogo

g! lb | grep Filesystem
  487|Active     |   10|Filesystem Access Service (1.0.0)
  488|Active     |   10|Filesystem Access API (1.0.0)
  489|Active     |   10|Filesystem Access Portlet (1.0.0)

If they are all active, you're in good shape.

Viewing in the Portal

For the first time in this blog series, we actually have something we can add in the portal.  Log in as an administrator to your portal instance and go to the add panel.  Under the Applications section you should find the System Administration group and in there is our new Filesystem Access portlet.  Grab it and drop it on the page.  You should see it render in the page.

So we haven't really done anything to the main view, but let's test what we did add.  First go to the ... menu and choose the Configuration element.  Although it probably isn't pretty, you should see your configuration panel there.  You can change values, click save, exit and come back in to see that they stick, ...  Basically the configuration should be working.

Next pull up the left panel and go to the Control Panel to the Configuration section.  You should see the Filesystem Access portlet at the bottom of the list (well, position depends upon whatever else you have installed, but in a clean bundle it will be at the bottom).  You can click on the option and you'll get your portlet again, but just the welcome message.  Not very impressive, but we'll get there.

You can also go to the System Settings control panel and you'll see a Platform tab at the top.  When you click on Platform, you should see Filesystem Access.  Click on it for the default configuration settings (used as defaults for new portlet instances).  This configuration will look different than your configuration.jsp because it's not using your JSP at all; the form is generated from the FilesystemAccessPortletInstanceConfiguration class and the information in the @Meta annotations.

Another cool thing you can try: create a com.liferay.filesystemaccess.portlet.config.FilesystemAccessPortletInstanceConfiguration.cfg file in the $LIFERAY_HOME/osgi/modules directory and you can define the default configuration values there.  The values in this file override the defaults from the @Meta annotations and give you a deployable configuration artifact.
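The .cfg file itself is just a properties-style file whose keys match the member names of the configuration interface.  For example (the keys below are hypothetical placeholders, not the project's actual members):

```
rootPath=/opt/shared-files
showPermissions=true
```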


Well, our portlet is not done yet.  We have a good start on it, but we'll have yet more to add in the next part.  Remember the code for the project is up on Github and the code for this part is in the dev-part-5 branch.

In the next part we'll pick up on the portlet development and start building the real views.

Liferay 7 Development, Part 4

Technical Blogs September 14, 2016 By David H Nebinger


In part 3 of the blog, the API for the Filesystem Access project was fleshed out.

In this part of the blog, we'll create the service implementation module.

The Data Transfer Object

Now we get to implement our DTO object.  The interface is pretty simple, we just have to add all of the methods and expose values from data we retain.

The code will be available on GitHub so I won't go into great detail here.  Suffice it to say that for the most part it is exposing values from the underlying File object from the filesystem.

The Service

The service implementation is just as straightforward as the DTO.  It leverages the standard Java I/O packages and APIs and also uses com.liferay.portal.kernel.util.FileUtil for some supporting functions.  It also integrates the Liferay auditing mechanism to issue relevant audit messages.

The Annotations

It's really the annotations which will make the FilesystemAccessServiceImpl into a true DS service.

The first annotation is the Component annotation and it is used such as:

import org.osgi.service.component.annotations.Component;
import org.osgi.service.component.annotations.Reference;

/**
 * class FilesystemAccessServiceImpl: Implementation class of the FilesystemAccessService.
 * @author dnebinger
 */
@Component(service = FilesystemAccessService.class)
public class FilesystemAccessServiceImpl implements FilesystemAccessService {

Here we are declaring that we have a DS component implementation that provides the FilesystemAccessService.  And boom, we're done.  We have just implemented a DS service.  Couldn't be easier, could it?

The other annotation import there is the Reference annotation.  We're going to be showing how to use the DS service in the next part of the blog, but we'll inject a preview here.

In the discussion of the service in the previous section, we mentioned that we were going to be integrating the Liferay audit mechanism so different filesystem access methods would be audited.  To integrate the audit mechanism, we need an instance of Liferay's AuditRouter service.  But how do we get the reference?  Well, AuditRouter is also implemented as a DS service.  We can have OSGi inject the service when our module is started using the Reference annotation such as:

/**
 * _auditRouter: We need a reference to the audit router.
 */
@Reference
private AuditRouter _auditRouter;

And boom, we're done there too.

OSGi is handling all of the heavy lifting for us.  One annotation exposes our class as a service component implementation and another annotation will inject services which we have a dependency on.

Configure The Module

So we're not quite finished yet.  We have the code done, but we should configure our bundle so we get the right outcome.

Edit the bundle file so it contains the following:

Bundle-Name: Filesystem Access Service
Bundle-SymbolicName: com.liferay.filesystem.access.svc
Bundle-Version: 1.0.0
-sources: true

The Liferay standard is to give a meaningful name for the bundle, but the symbolic name should be something akin to the project package.  You definitely want the symbolic name to be unique when it is deployed to the OSGi container.


Well the modifications are all done.  At a command prompt, go to the modules/apps/filesystem-access-svc directory and execute the following command:

$ ../../../gradlew build

You should end up with your new com.liferay.filesystem.access.svc-1.0.0.jar bundle file in the build/libs directory.  If you have a Liferay 7 CE or Liferay DXP tomcat environment running, you can drop the jar into the Liferay deploy folder.

Drop into the Gogo Shell and you can even verify that the module has started:

Welcome to Apache Felix Gogo

g! lb | grep Filesystem
  486|Active     |   10|Filesystem Access API (1.0.0)
  487|Active     |   10|Filesystem Access Service (1.0.0)

So we see that our module deployed and started correctly; both of our modules have now been deployed and started successfully, so all is good.

When things go awry: if you've only deployed the service module and not the api module, you'll see something like:

Welcome to Apache Felix Gogo

g! lb | grep Filesystem
  487|Installed  |   10|Filesystem Access Service (1.0.0)

This is really a bad outcome.  Installed simply means it's there in the OSGi container, but it won't be available.  We can try to start the module:

g! start 487
org.osgi.framework.BundleException: Could not resolve module: com.liferay.filesystem.access.svc [487]
  Unresolved requirement: Import-Package: com.liferay.filesystemaccess.api; version="[1.0.0,2.0.0)"

This is how you see why your module isn't started.  These are the kinds of messages you'll see most often: some sort of missing dependency that you'll have to satisfy before your module will start.  The worst part is that you won't see these errors in the logs.  From the logs' perspective you just won't see a "STARTED" message for your module, and as we all know, the absence of a message is not an error that's easy to spot.

This current error is easy to fix - just deploy the api module.  As soon as it is started, OSGi will auto-start the service module; you won't have to start it yourself.  You can check that both modules are running by listing the bundles again, and you'll see they are both now marked as Active.


So we have our DS service completed.  The API module defining the service is out there and active.  We have an implementation of the service also deployed and active, too.  If there was a need for it, we could build alternative implementations of the DS service and deploy those also, but that's a discussion that should be saved for perhaps a different blog entry.

So we have our DS service, we just don't have a service client yet.  Well that's coming in the next part of the blog.  We're actually going to start building the filesystem access portlet and layer in our client code.  See you there...


Liferay 7 Development, Part 3

Technical Blogs September 14, 2016 By David H Nebinger


In part 2 of the series we created our initial project modules for the Filesystem Access Portlet project.

In part 3, we're going to move on to fleshing out the DS service that will provide a service layer for our portlet.

Admittedly this service layer is contrived; while building out the initial version of this portlet there was no separate service layer, no external modules, everything was implemented directly in the portlet itself.  The portlet worked just fine.

But that doesn't make a great example of how to build things for Liferay 7 so I refactored out a service tier.

The Service Tier

No, we're not building with ServiceBuilder (even though it is still supported under Liferay 7) as our service layer has no database component.

No, we're not building a service tier using the Fake Entity technique for ServiceBuilder, that's not really necessary anymore in the new Liferay 7 OSGi world.

We're building a straight DS service.  In OSGi, DS is an acronym for Declarative Services.  I don't want to do a deep dive into what DS is (especially since there are already some great ones out there), but suffice it to say we're building out a module to define a service (the filesystem-access-api module) along with an implementation module (the filesystem-access-svc module), and we're also going to have a service consumer (the filesystem-access-web module).  DS is the runtime glue that will bring these three modules together.

With DS we get new capabilities in Liferay to define independent services and apis that are not database related.  So you get the chance to build out your own entities, build services to operate on those entities and modularize your code to separate these concerns.  And DS also supports wiring of components at runtime without having to implement tedious ServiceTrackers or other more complex constructs.
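For contrast, here's a sketch of the kind of boilerplate DS replaces.  Without DS, a consumer would manage an OSGi ServiceTracker by hand (the class below is illustrative only; bundleContext would come from a bundle activator):

```java
import org.osgi.framework.BundleContext;
import org.osgi.util.tracker.ServiceTracker;

public class ManualConsumer {

	private ServiceTracker<FilesystemAccessService, FilesystemAccessService>
		_tracker;

	public void start(BundleContext bundleContext) {
		// Open a tracker and watch for the service to appear.
		_tracker = new ServiceTracker<>(
			bundleContext, FilesystemAccessService.class, null);
		_tracker.open();
	}

	public FilesystemAccessService getService() {
		// May return null if no implementation is currently active.
		return _tracker.getService();
	}

	public void stop() {
		_tracker.close();
	}
}
```

With DS, all of this bookkeeping collapses into a single @Reference annotation.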

The Data Transfer Object

So our DTO is actually going to be quite simple.  We need a container for some filesystem data so we can pass complex data around.

The container poses a question: should we define our container as a class or as an interface?  Either works, but my own recommendation is:

  1. If the container can be a simple POJO w/o any business logic, then use a class.
  2. If the container includes business logic, then use an interface.

When you're just passing data around, a POJO class is a great container.  But as soon as you start building out some methods that have business logic code, you're better off using an interface if only to hide the implementation details from your service API consumers.
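To illustrate the recommendation, here's a minimal, hypothetical example (not the project's FilesystemItem): the interface hides where getDataStream() actually gets its bytes, so the implementation can change without touching consumers:

```java
import java.io.ByteArrayInputStream;
import java.io.InputStream;
import java.nio.charset.StandardCharsets;

// Hypothetical trimmed-down container, not the project's FilesystemItem.
interface Item {

	public String getName();

	// Business logic lives behind the interface; callers never learn
	// where the bytes actually come from.
	public InputStream getDataStream();
}

class StringBackedItem implements Item {

	private final String _name;
	private final String _content;

	StringBackedItem(String name, String content) {
		_name = name;
		_content = content;
	}

	@Override
	public String getName() {
		return _name;
	}

	@Override
	public InputStream getDataStream() {
		return new ByteArrayInputStream(
			_content.getBytes(StandardCharsets.UTF_8));
	}
}
```

A file-backed or network-backed implementation could replace StringBackedItem later and no consumer code would change.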

Because I've already implemented the code, I know that a simple POJO container is not going to work for what I'll be implementing, so we're going to use an interface for our container item.

In the filesystem-access-api module, add a com.liferay.filesystemaccess.api.FilesystemItem interface.  I'm not going to list all of the methods here, but basically it's going to look something like:

/**
 * class FilesystemItem: This is a container for an individual filesystem item.
 * @author dnebinger
 */
public interface FilesystemItem extends Serializable {

	/**
	 * getAbsolutePath: Returns the filesystem absolute path to the item.
	 * @return String The absolute path.
	 */
	public String getAbsolutePath();

	/**
	 * getData: Returns the byte data from the file.
	 * @return byte[] The array of file data.
	 * @throws IOException in case of read failure.
	 */
	public byte[] getData() throws IOException;

	/**
	 * getDataStream: Returns an InputStream for the file data.
	 * @return InputStream The data input stream.
	 * @throws FileNotFoundException in case of read error.
	 */
	public InputStream getDataStream() throws FileNotFoundException;
}

Notice how we can return complex objects, unlike how we are limited in ServiceBuilder code.  Pretty cool, huh?

So remember how I want to use an interface if there is business logic in play?  The getDataStream() method will have business logic in it for returning an input stream for the file, so an interface is the best route.

The Service

The service is going to be a basic interface listing out all of our methods.  We'll add these to the com.liferay.filesystemaccess.api.FilesystemAccessService interface that we created in part 2.  The interface we'll be implementing is:

/**
 * class FilesystemAccessService: The service interface for the filesystem
 * access.
 *
 * NOTE: The methods below refer to the *root path*.  The portlet will allow
 * an admin to constrain access to a given, fixed path.  This is the root path.
 * Whatever the user attempts, they will always be constrained within the root
 * path.
 * @author dnebinger
 */
public interface FilesystemAccessService {

	/**
	 * countFilesystemItems: Counts the items at the given path.
	 * @param rootPath The root path.
	 * @param localPath The local path to the directory.
	 * @return int The count of items at the given path.
	 */
	public int countFilesystemItems(
		final String rootPath, final String localPath);

	/**
	 * createFilesystemItem: Creates a new filesystem item at the given path.
	 * @param rootPath The root path.
	 * @param localPath The local path for the new item.
	 * @param name The name for the new item.
	 * @param directory Flag indicating create a directory or a file.
	 * @return FilesystemItem The new item.
	 * @throws PortalException in case of error.
	 */
	public FilesystemItem createFilesystemItem(
			final String rootPath, final String localPath, final String name,
			final boolean directory)
		throws PortalException;

	/**
	 * deleteFilesystemItem: Deletes the target item.
	 * @param rootPath String The root path.
	 * @param localPath The local path to the item to delete.
	 * @throws PortalException In case of delete error.
	 */
	public void deleteFilesystemItem(
			final String rootPath, final String localPath)
		throws PortalException;

	/**
	 * getFilesystemItem: Returns the FilesystemItem at the given path.
	 * @param rootPath The root path.
	 * @param localPath The local path to the item.
	 * @param timeZone The time zone for the access.
	 * @return FilesystemItem The found item.
	 */
	public FilesystemItem getFilesystemItem(
		final String rootPath, final String localPath,
		final TimeZone timeZone);

	/**
	 * getFilesystemItems: Retrieves the list of items at the given path,
	 * includes support for scrolling through the items.
	 * @param rootPath The root path.
	 * @param localPath The local path to the directory.
	 * @param startIdx The starting index of items to return.
	 * @param endIdx The ending index of items to return.
	 * @param timeZone The time zone for the access.
	 * @return List The list of FilesystemItems.
	 */
	public List<FilesystemItem> getFilesystemItems(
		final String rootPath, final String localPath, final int startIdx,
		final int endIdx, final TimeZone timeZone);

	/**
	 * touchFilesystemItem: Touches the target item, updating the modified
	 * timestamp.
	 * @param rootPath The root path.
	 * @param localPath The local path to the item.
	 */
	public void touchFilesystemItem(
		final String rootPath, final String localPath);

	/**
	 * updateContent: Updates the file content.
	 * @param rootPath The root path.
	 * @param localPath The local path to the item to update.
	 * @param content String The new content to write.
	 * @throws PortalException In case of write error.
	 */
	public void updateContent(
			final String rootPath, final String localPath, final String content)
		throws PortalException;

	/**
	 * updateDownloadable: Updates the downloadable flag for the item.  For
	 * directories, they will be downloadable if the total bytes to zip is less
	 * than the given value and contains fewer than the given number of files.
	 * These limits allow the administrator to prevent building zip files of
	 * downloads of a directory if building the zip file would overwhelm
	 * available resources.
	 * @param filesystemItem The item to update the downloadable flag on.
	 * @param maxUnzippedSize The max number of bytes to allow for download,
	 *                        for a directory it's the max total unzipped bytes.
	 * @param maxZipFiles The max number of files which can be zipped when
	 *                    checking download of a directory.
	 */
	public void updateDownloadable(
		final FilesystemItem filesystemItem, final long maxUnzippedSize,
		final int maxZipFiles);

	/**
	 * uploadFile: This method handles the upload and store of a file.
	 * @param rootPath The root path.
	 * @param localPath The local path where the file will go.
	 * @param filename The filename for the file.
	 * @param target The uploaded, temporary file.
	 * @param timeZone The user's time zone for modification date display.
	 * @return FilesystemItem The newly loaded filesystem item.
	 * @throws PortalException in case of error.
	 */
	public FilesystemItem uploadFile(
		final String rootPath, final String localPath, final String filename,
		final File target, final TimeZone timeZone) throws PortalException;
}
So this interface allows us to get file items, create and delete them, update the contents, ...  Pretty much everything we need to support for the filesystem access portlet.
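The root path constraint described in the interface comments boils down to path normalization.  Here's a standalone sketch (not the project's actual implementation) of how a service implementation might enforce it with java.nio:

```java
import java.nio.file.Path;
import java.nio.file.Paths;

// Sketch only: illustrates the root path constraint, not the project code.
class RootPathGuard {

	/**
	 * Resolves localPath beneath rootPath and rejects any result that
	 * escapes the root (e.g. via "../" segments).
	 */
	static Path resolveWithinRoot(String rootPath, String localPath) {
		Path root = Paths.get(rootPath).toAbsolutePath().normalize();

		// Treat the local path as relative to the root, even if it
		// starts with a separator.
		String relative = localPath.startsWith("/") ?
			localPath.substring(1) : localPath;

		Path resolved = root.resolve(relative).normalize();

		if (!resolved.startsWith(root)) {
			throw new IllegalArgumentException(
				"Local path escapes the root path: " + localPath);
		}

		return resolved;
	}
}
```

Whatever localPath the user supplies, the normalized result is checked against the configured root before any filesystem operation runs.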

Additional Sources

There are some additional files in the com.liferay.filesystemaccess.api package.  These include some exception classes and a constant class.

There is also a com.liferay.filesystemaccess.audit package that contains some classes used for integrating auditing into the portlet.  Since we're exposing the underlying filesystem, we're going to add audit support so we can track who does what.  The classes in this package will help facilitate the auditing.

Configure The Module

So we're not quite finished yet.  We have the code done, but we should configure our bundle so we get the right outcome.

Edit the bundle file so it contains the following:

Bundle-Name: Filesystem Access API
Bundle-SymbolicName: com.liferay.filesystem.access.api
Bundle-Version: 1.0.0
Export-Package: \
	com.liferay.filesystemaccess.api,\
	com.liferay.filesystemaccess.audit
-sources: true

The Liferay standard is to give a meaningful name for the bundle, but the symbolic name should be something akin to the project package.  You definitely want the symbolic name to be unique when it is deployed to the OSGi container.

Since we have created the API, we'll declare the export package(s) for classes that others may consume.


Well the modifications are all done.  At a command prompt, go to the modules/apps/filesystem-access-api directory and execute the following command:

$ ../../../gradlew build

You should end up with your new com.liferay.filesystem.access.api-1.0.0.jar bundle file in the build/libs directory.  If you have a Liferay 7 CE or Liferay DXP tomcat environment running, you can drop the jar into the Liferay deploy folder.

Drop into the Gogo Shell and you can even verify that the module has started:

Welcome to Apache Felix Gogo

g! lb | grep Filesystem
  486|Active     |   10|Filesystem Access API (1.0.0)

So we see that our module deployed and started correctly, so all is good.

In the next part of the blog, we'll go about implementing the service module.

Liferay 7 Development, Part 2

Technical Blogs September 13, 2016 By David H Nebinger


In part 1, the filesystem access portlet project for Liferay 7 was introduced.

In this part, we're going to use Blade and Gradle to set up our basic project structure.

We're not going to do much in the way of code in this part, it's going to be enough that we get all of our modules defined and ready for development in future parts of the blog.

The Tools - Gradle

So if you've been in Liferay development for a while, I'm sure you know the history of Liferay's build tool selection, but we'll review anyway.

First there was Ant.  Of course, when Liferay started using Ant, everybody was using Ant.  It was really the only game in town.  And with Ant came the Liferay SDK and its rigid structure for where projects had to be and how they had to be structured.

And that was it, for a long time, much longer than necessary.  The rest of the world had moved on to Maven, discarding the verbose and declarative nature of the Ant build.xml file, but Liferay stuck with Ant and the SDK...

With Liferay 6.1, however, Liferay started to catch up and the Liferay Maven artifacts started coming out.  It was rough at first, but by the time we were on to 6.2 Maven support was all there and we were finally able to discard the rigid SDK approach and free our projects from holding all of the dependency jars.

But the development world kept moving on, discarding Maven for cooler build tools with awesome features and flexibility that Maven couldn't offer.

Instead of slowly integrating a new tool in, Liferay went all-in with the introduction of Gradle support in Liferay 7.

Liferay is adopting Gradle for its projects (modularization of the core, side projects, etc.) so it's a really safe bet that Gradle support is going to be around for a while.

Me, I'm much more pragmatic; if Liferay is going to use this tool for their projects, I want to use the tool too; I don't want to be the odd ball looking for support on something they may not be interested in supporting some time in the future.

So Gradle it shall be.  Go ahead and download and install Gradle on your system; I'm not going to go into details here, but you really shouldn't have any problems installing it and their documentation and other documentation online will be able to help you get the job done.

The Gradle Wrapper

I want to take a moment and mention something you'll often see in Gradle projects, the gradlew[.bat] files and the gradle directory.  These guys are the Gradle Wrapper.

The wrapper is basically a fixed version of Gradle that is installed and run locally within the project directory.  Why, when you can install Gradle, would you want to do such a thing?  Well, if you're passing a project to a developer that doesn't have Gradle installed (like you, when you pull down a Liferay 7 project for the first time), the wrapper allows them to build the project anyway.  It also sticks to the version of Gradle used to install the wrapper; this removes version-specific issues from the build process since the right version of Gradle travels with the project regardless of what version the developer has.

If you're not creating projects, you probably won't need to set up the Gradle Wrapper.  If you're creating a project to share with others, when you initialize the Gradle project you'll get the gradle wrapper in there too.

After the wrapper is generated, you're going to want to use the wrapper from that point forward.  Instead of issuing the command "gradle build" to do a build, instead you'd use the wrapper version, "gradlew build"

The Tools - Blade

Okay, some explanation here before we get too far in; there's like two Liferay projects which you may find referred to as Blade.

The one we're going to use, the Blade CLI, well we'll get to that one in just a moment.

The other one is the Blade Samples project.  The Blade Samples project is a big repository of Liferay 7 modules, a set of 31 (currently), all of them provided using different build tools (BND, Gradle, Liferay's Gradle plugin, and Maven).  You can use it to, say, master some of the OSGi module stuff using your favorite build tool and, once you understand that, review the implementation using a different build tool to see how to translate what you know.  All of that said, we won't really come back to the Blade Samples again, but I did want to highlight it as it can be a valuable resource for you in the future.

Back to the Blade CLI.  This command line tool is a utility to help scaffold your gradle-based Liferay modules.  It can generate the foundation for you so you can jump right into fleshing out your code.

If you haven't already done so, install the Blade CLI per the instructions on the Blade website.  If you have already installed it, you should update it using the command:

[sudo] blade update

Note you only need the sudo portion if you are on a system that has a sudo command; if you don't know if you do have this command, it is likely that you do not have it.

The Project Structure

One of our requirements is to build full OSGi-compliant Liferay 7 modules.  That's a mouthful, but it draws out an important point.

OSGi-compliant modules means we're going to be building jars.  OSGi bundles, actually, but in Liferay we call them modules because, well, they actually will include some Liferay specific stuff and have some Liferay runtime container dependencies.  Liferay calls these guys modules, so we will too.  When someone says bundle, just know they're talking about modules.

Now if you're not familiar w/ what that really means, well then it's time to start learning about them.  If I had to boil it down to something simple, I guess I'd say it's pretty close in concept to the Spring Framework - small pieces of functionality that are wired together at runtime to build a functional application.  Obviously it is a lot bigger than that with multiple classloaders, a slew of runtime dynamic configuration support, etc.

But thinking along the lines of Spring we can come up with a pretty good architecture for our portlet implementation.

Now we could just put all the code into one bundle and everything would work, it would be simple, and it would get the job done.  But if we're really going to learn something here, let's over-architect it as it will make a better educational experience.

So, thinking along the Spring lines, we know we're going to have some sort of "service" that we'll use to hold our file system access business logic.  Like in Spring, we'll separate that into an API module (the Spring interfaces) and the Service module (the concrete implementation classes).  And yes, we still have our portlet to create, so that will be a module too.
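The API/implementation split described above can be sketched in plain Java.  The FilesystemAccessService name comes from the plan; the listFiles method and its stubbed behavior are hypothetical placeholders just to show the shape:

```java
import java.util.Arrays;
import java.util.List;

// API module: just the contract, no implementation details leak out.
interface FilesystemAccessService {
    List<String> listFiles(String path);
}

// Service module: the concrete implementation, hidden behind the interface.
class FilesystemAccessServiceImpl implements FilesystemAccessService {
    public List<String> listFiles(String path) {
        // a real implementation would read the filesystem; stubbed for illustration
        return Arrays.asList(path + "/a.txt", path + "/b.txt");
    }
}

public class ApiSplitDemo {
    public static void main(String[] args) {
        // the portlet (the consumer) codes only against the API type
        FilesystemAccessService service = new FilesystemAccessServiceImpl();
        System.out.println(service.listFiles("/tmp"));  // [/tmp/a.txt, /tmp/b.txt]
    }
}
```

Consumers depending only on the interface is exactly what lets us swap the service module later without touching the portlet.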

To get back on task, our project will consist of one set of parent build files and 3 modules.

Building The Project Structure

Okay, we have the background for the tools behind us, time to build out the structure, so let's get to it.

So first you need a directory somewhere where you're going to be creating everything.  I'm going to create the "filesystem-access" folder in my local projects directory; every other step below and in the other parts of this blog will all be taking place within this folder.

Step 1 - Set Up Workspace

The first thing we need to do is initialize the Blade/Gradle environment in our project directory:

$ blade init
Note that this command will fail if there are files in the directory, so you want to do this before any other steps.

The structure that you get here is actually referred to as the "Liferay Workspace".  The Liferay Workspace is a container for building all of your plugins.  It's kind of like the old SDK except it is based on Gradle (not Ant) and only distinguishes between modules and themes (unlike all of the different plugin types supported by the SDK).
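As a rough sketch (the exact contents vary a bit by Blade version), the initialized workspace looks something like:

```
filesystem-access/
  build.gradle
  gradle.properties
  settings.gradle
  configs/    (environment-specific configuration)
  gradle/     (the Gradle wrapper)
  modules/    (your OSGi modules go here)
  themes/     (your themes go here)
```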

We'll be using this workspace as the foundation for our project.  For a project that will comprise multiple modules, the Liferay Workspace is a good fit.

Step 2 - Create The Modules

We will be using the Blade CLI to create our modules.  Change to the modules/apps directory (you may need to create the apps directory in modules) and execute the following commands to create the modules:

$ blade create -t api -p com.liferay.filesystemaccess filesystem-access-api
$ blade create -t api -p com.liferay.filesystemaccess.svc filesystem-access-svc
$ blade create -t mvcportlet -p com.liferay.filesystemaccess -c FilesystemAccessPortlet filesystem-access-web

Step 3 - Clean Up The Modules

Since we created two of the projects from the api template, we're going to take some time to clean up the generated code.

  • In the filesystem-access-api module, add a blank interface, com.liferay.filesystemaccess.api.FilesystemAccessService (we'll fill it out in the next blog).
  • In the filesystem-access-api module, delete the com.liferay.filesystemaccess.api.FilesystemAccessApiActivator class.
  • In the filesystem-access-svc module, change the package from com.liferay.filesystemaccess.svc.api to com.liferay.filesystemaccess.svc.internal.
  • In the filesystem-access-svc module, add a blank class, com.liferay.filesystemaccess.svc.internal.FilesystemAccessServiceImpl (we'll fill it out in the next blog).
  • In the filesystem-access-svc module, delete the com.liferay.filesystemaccess.svc.FilesystemAccessServiceActivator class.


Okay, so far we haven't done very much, but let's review anyway:

  • We reviewed a little about Gradle and Blade.
  • We created a Liferay Workspace to hold our modules.
  • We created our initial yet empty modules.

In the next blog post we'll get into some real development stuff, building out a DS service layer...

The project has been pushed up to GitHub; you can access it here:

The files for this part of the blog are in branch part-2:

Liferay 7 Development, Part 1

Technical Blogs September 13, 2016 By David H Nebinger


So I've been doing some LR7 development recently.  It's an exciting and challenging platform to get your head around and I thought I would just work up an example portlet, spread it across a number of blog entries, and hopefully deliver something useful in the process.

And that's what I'm going to do here.  Not only is the blog going to cover all of the step by step sort of stuff, but the project itself is going to be available on GitHub for you to pull and review at your leisure.

Let's dive in...

The Project

First off, I cannot take credit for the project idea; I'm saying that up front so there's no confusion.

I was recently working in an environment where developers had no direct access to the development server - no way to view or edit files, no way to download log files to review what was going on, no way to tweak property files, ...  It wasn't so much that we couldn't actually do these kinds of things (the sys admin would let us look at their terminal viewing files, the SA would make whatever changes we asked, the SA would give us the logs as is, ...), the developers were just not allowed to have command line access on the server.

So this planted a seed that started with the question, has anyone exposed access to the filesystem in the portal before?

Actually they have.  I found Jan Eerdekens' blog about a portlet he created:

So cool, there was precedent, but Jan's portlet is for Liferay 6.2, and that wasn't going to work for me with 7.

So the seed has grown into the blog project: Create a filesystem access portlet.

So now I have a plant (the project), but it needs some nurturing to get it to bear fruit.

The Specifications

Our portlet has a basic set of specifications that it must satisfy:

  • Must be able to list files/folders from the filesystem.
  • Must be able to view file contents within the browser, preferably with syntax highlighting.
  • Must be able to edit file contents within the browser, preferably with syntax highlighting.
  • Must be able to add new files and folders.
  • Must be able to delete files and folders.
  • Must be able to 'touch' files and folders, updating the last modified dates for the files and folders.
  • Must be able to download files and directories (as zip files) from the server.
  • Must be able to upload files to the server.
  • Must be configurable so these features can be disabled, allowing an admin to choose a subset of these features which are allowed.

That's a pretty sweet set of specifications.  The good news is that they are all pretty easy to knock out.

The Requirements

There are some additional requirements for our project:

  • Must run on Liferay 7 CE GA2 (or newer).
  • Must use Gradle for the build tool.
  • Must be full-on OSGi modules, no legacy stuff.
  • Must leverage Lexicon.
  • Must leverage the new Configuration facilities.
  • Must leverage the new Liferay MVC portlet framework.
  • Must provide new Panel Application (side bar) support.

So these are the real fun parts of the project.  We're going to be using the new Liferay build tools.  We're going to use OSGi.  We're going to leverage the new Liferay MVC framework and all of the other goodies.  We're going to see what role Lexicon plays in our app.

One additional requirement - the project must be available on GitHub.  This is going to be our tool for learning LR7 development, so we're going to share.


Well, that's it for part 1.  The project has been introduced, so we all know where we're going and how we'll judge whether the outcome is successful.

Look forward to seeing you in the next installment...


Liferay 7, Service Builder and External Databases

Technical Blogs July 13, 2016 By David H Nebinger

So I'm a long-time supporter of ServiceBuilder.  I saw its purpose back in the Liferay 4 and 5 days and have championed it in the forums and here in my blog.

With the release of Liferay 7, ServiceBuilder has undergone a few changes mostly related to the OSGi modularization.  ServiceBuilder will now create two modules, one API module (comparable to the old service jar w/ the interfaces) and a service module (comparable to the service implementation that used to be part of a portlet).

But at its core, it still does a lot of the same things.  The service.xml file defines all of the entities, you "buildService" (in Gradle speak) to rebuild the generated code, consumers still use the API module, and your implementation is encapsulated in the service module.  The generated code and the Liferay ServiceBuilder framework are built on top of Hibernate, so all of the same Spring and Hibernate facets still apply.  All of the features used in the past are also supported, including custom SQL, DynamicQuery, custom finders and even external database support.

External Database support is still included for ServiceBuilder, but there are some restrictions and setup requirements that are necessary to make them work under Liferay 7.

Examples are a good way to work through the process, so I'm going to present a simple ServiceBuilder component that will be tracking logins in an HSQL database separate from the Liferay database.  That last part is obviously contrived since one would not want to go to HSQL for anything real, but you're free to substitute any supported DB for the platform you're targeting.

The Project

So I'll be using Gradle, JDK 1.8 and Liferay CE 7 GA2 for the project.  Here's the command to create the project:

blade create -t servicebuilder -p com.liferay.example.servicebuilder.extdb sb-extdb

This will create a ServiceBuilder project with two modules:

  • sb-extdb-api: The API module that consumers will depend on.
  • sb-extdb-service: The service implementation module.

The Entity

So the first thing we need to define is our entity.  The service.xml file is in the sb-extdb-service module, and here's what we'll start with:

<?xml version="1.0"?>
<!DOCTYPE service-builder PUBLIC "-//Liferay//DTD Service Builder 7.0.0//EN" "http://www.liferay.com/dtd/liferay-service-builder_7_0_0.dtd">

<service-builder package-path="com.liferay.example.servicebuilder.extdb">

  <!-- Define a namespace for our example -->

  <!-- Define an entity for tracking login information. -->
  <entity name="UserLogin" uuid="false" local-service="true" remote-service="false" data-source="extDataSource" >
  <!-- session-factory="extSessionFactory" tx-manager="extTransactionManager" -->

    <!-- userId is our primary key. -->
    <column name="userId" type="long" primary="true" />

    <!-- We'll track the date of last login -->
    <column name="lastLogin" type="Date" />

    <!-- We'll track the total number of individual logins for the user -->
    <column name="totalLogins" type="long" />

    <!-- Let's also track the longest time between logins -->
    <column name="longestTimeBetweenLogins" type="long" />

    <!-- And we'll also track the shortest time between logins -->
    <column name="shortestTimeBetweenLogins" type="long" />
  </entity>
</service-builder>
This is a pretty simple entity for tracking user logins.  The user id will be the primary key and we'll track dates, times between logins as well as the user's total logins.

Just as in previous versions of Liferay, we must specify the external data source for our entity/entities.

ServiceBuilder will create and manage tables only for the Liferay database.  ServiceBuilder will not manage the tables, indexes, etc. for any external databases.

In our particular example we're going to be wiring up to HSQL, so I've created an HSQL script file with the table definition.
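The actual script isn't shown here, but as a hypothetical sketch, HSQL DDL matching the entity's columns might look like the following (table and column names assume ServiceBuilder's defaults with no namespace prefix):

```sql
-- Illustrative only: a UserLogin table mirroring the service.xml entity.
CREATE TABLE UserLogin (
    userId BIGINT NOT NULL PRIMARY KEY,
    lastLogin TIMESTAMP,
    totalLogins BIGINT,
    longestTimeBetweenLogins BIGINT,
    shortestTimeBetweenLogins BIGINT
);
```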


The Service

The next thing we need to do is build the services.  In the sb-extdb-service directory, run:

gradle buildService

Eventually we're going to build out our post login hook to manage this tracking, so we can guess that we could use a method to simplify the login tracking.  Here's the method that we'll add to UserLoginLocalServiceImpl:

public class UserLoginLocalServiceImpl extends UserLoginLocalServiceBaseImpl {
  private static final Log logger = LogFactoryUtil.getLog(UserLoginLocalServiceImpl.class);

  /**
   * updateUserLogin: Updates the user login record with the given info.
   * @param userId User who logged in.
   * @param loginDate Date when the user logged in.
   */
  public void updateUserLogin(final long userId, final Date loginDate) {
    UserLogin login;

    // first try to get the existing record for the user
    login = fetchUserLogin(userId);

    if (login == null) {
      // user has never logged in before, need a new record
      if (logger.isDebugEnabled()) logger.debug("User " + userId + " has never logged in before.");

      // create a new record
      login = createUserLogin(userId);

      // update the login date
      login.setLastLogin(loginDate);

      // initialize the values; zero durations mean "not yet measured"
      login.setTotalLogins(1);
      login.setLongestTimeBetweenLogins(0);
      login.setShortestTimeBetweenLogins(0);

      // add the login
      addUserLogin(login);
    } else {
      // user has logged in before, just need to update record.

      // increment the logins count
      login.setTotalLogins(login.getTotalLogins() + 1);

      // determine the duration time between the current and last login
      long duration = loginDate.getTime() - login.getLastLogin().getTime();

      // if this duration is longer than last, update the longest duration.
      if (duration > login.getLongestTimeBetweenLogins()) {
        login.setLongestTimeBetweenLogins(duration);
      }

      // if this duration is shorter than last (or not yet measured), update the shortest duration.
      if ((login.getShortestTimeBetweenLogins() <= 0) || (duration < login.getShortestTimeBetweenLogins())) {
        login.setShortestTimeBetweenLogins(duration);
      }

      // update the last login timestamp
      login.setLastLogin(loginDate);

      // update the record
      updateUserLogin(login);
    }
  }
}
After adding the method, we'll need to build services again for the method to get into the API.
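The duration bookkeeping in updateUserLogin is just millisecond arithmetic on java.util.Date; a standalone sketch (the timestamps are arbitrary stand-ins):

```java
import java.util.Date;

public class LoginDurationDemo {
    public static void main(String[] args) {
        // hypothetical timestamps standing in for the stored and current logins
        Date lastLogin = new Date(1_000_000L);
        Date currentLogin = new Date(4_000_000L);

        // the same computation the service method performs
        long duration = currentLogin.getTime() - lastLogin.getTime();

        System.out.println(duration);  // 3000000
    }
}
```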

Defining The Data Source Beans

So we now need to define our data source beans for the external data source.  We'll create an XML file, ext-db-spring.xml, in the sb-extdb-service/src/main/resources/META-INF/spring directory.  When our module is loaded, the Spring files in this directory will get processed automatically into the module's Spring context.

<?xml version="1.0"?>

<beans xmlns="http://www.springframework.org/schema/beans"
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xsi:schemaLocation="http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans.xsd">

  <!--
    NOTE: A current restriction in LR7's handling of external data sources requires us to redefine the
    liferayDataSource bean in our Spring configuration.

    The following beans define a new liferayDataSource based on the jdbc.ext. prefix in portal-ext.properties.
  -->
  <bean class="com.liferay.portal.dao.jdbc.spring.DataSourceFactoryBean" id="liferayDataSourceImpl">
    <property name="propertyPrefix" value="jdbc.ext." />
  </bean>

  <bean class="org.springframework.jdbc.datasource.LazyConnectionDataSourceProxy" id="liferayDataSource">
    <property name="targetDataSource" ref="liferayDataSourceImpl" />
  </bean>

  <!--
    Since our entities are all appropriately tagged with the extDataSource, we alias the above
    liferayDataSource so it matches the entities.
  -->
  <alias alias="extDataSource" name="liferayDataSource" />
</beans>

These bean definitions are a big departure from the classic way of using an external data source.  Previously we would define separate data source beans from the Liferay Data Source beans, but under Liferay 7 we must redefine the Liferay Data Source to point at our external data source.

This has a couple of important side effects:

  • Only one data source can be used in a single ServiceBuilder module.  If you have three different external data sources, you must create three different ServiceBuilder modules, one for each data source.
  • The normal Liferay transaction management limits the scope of transactions to the current module.  To manage transactions that cross ServiceBuilder modules, you must define and use XA transactions.

The last line, the alias line, defines a Spring alias for the liferayDataSource matching the data source named in your service.xml file.

So, back to our example.  We're planning on writing our records into HSQL, so we need to add the properties to portal-ext.properties for our external data source connection:

# Connection details for the HSQL database
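The individual property values aren't shown above; a hypothetical set for an HSQL database might look like the following (the key suffixes follow Liferay's jdbc.default.* convention, while the driver class, URL, and credentials are illustrative assumptions):

```properties
# Illustrative only: external HSQL connection using the jdbc.ext. prefix
jdbc.ext.driverClassName=org.hsqldb.jdbc.JDBCDriver
jdbc.ext.url=jdbc:hsqldb:${liferay.home}/data/extdb;hsqldb.write_delay=false
jdbc.ext.username=sa
jdbc.ext.password=
```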

The Post Login Hook

So we'll use blade to create the post login hook.  In the sb-extdb main directory, run blade to create the module:

blade create -p com.liferay.example.servicebuilder.extdb.event -t service -s sb-extdb-postlogin

Since blade doesn't know we're really adding a submodule, it has created a full standalone Gradle project.  While not shown here, I modified a number of the Gradle project files to make the postlogin module a submodule of the project.
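The exact edits aren't shown, but under standard Gradle multi-project conventions the key change is a settings.gradle include along these lines (a hypothetical sketch; module names taken from this project):

```gradle
// settings.gradle in the sb-extdb root (illustrative only)
include 'sb-extdb-api', 'sb-extdb-service', 'sb-extdb-postlogin'
```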

We'll create the com.liferay.example.servicebuilder.extdb.event.UserLoginTrackerAction class with the following details:

/**
 * class UserLoginTrackerAction: This is the post login hook to track user logins.
 *
 * @author dnebinger
 */
@Component(
  immediate = true, property = {"key=login.events.post"},
  service = LifecycleAction.class
)
public class UserLoginTrackerAction implements LifecycleAction {

  private static final Log logger = LogFactoryUtil.getLog(UserLoginTrackerAction.class);

  /**
   * processLifecycleEvent: Invoked when the registered event is triggered.
   * @param lifecycleEvent
   * @throws ActionException
   */
  @Override
  public void processLifecycleEvent(LifecycleEvent lifecycleEvent) throws ActionException {

    // okay, we need the user login for the event
    User user = null;

    try {
      user = PortalUtil.getUser(lifecycleEvent.getRequest());
    } catch (PortalException e) {
      logger.error("Error accessing login user: " + e.getMessage(), e);
    }

    if (user == null) {
      logger.warn("Could not find the logged in user, nothing to track.");

      return;
    }

    // we have the user, let's invoke the service
    getService().updateUserLogin(user.getUserId(), new Date());

    // alternatively we could just use the local service util:
    // UserLoginLocalServiceUtil.updateUserLogin(user.getUserId(), new Date());
  }

  /**
   * getService: Returns the user tracker service instance.
   * @return UserLoginLocalService The instance to use.
   */
  public UserLoginLocalService getService() {
    return _serviceTracker.getService();
  }

  // use the OSGi service tracker to get an instance of the service when available.
  private ServiceTracker<UserLoginLocalService, UserLoginLocalService> _serviceTracker =
    ServiceTrackerFactory.open(UserLoginLocalService.class);
}

Checkpoint: Testing

At this point we should be able to build and deploy the api module, the service module and the post login hook module.  We'll use the gradle command:

gradle build

In each of the submodules you'll find a build/libs directory where the bundle jars are.  Fire up your version of LR7 CE GA2 (make sure the jdbc.ext properties are in your portal-ext.properties file before starting) and put the jars in the $LIFERAY_HOME/deploy folder.  Liferay will pick them up and deploy them.

Drop into the gogo shell and check your modules to ensure they are started.

Log into the portal a few times and you should be able to find the database in the data directory and browse the records to see what it contains.


Using external data sources with Liferay 7's ServiceBuilder is still supported.  It's still a great tool for building a DB-based OSGi module; it still allows you to generate the bulk of the DB access code while encapsulating it behind an API in a controlled manner.

We reviewed the new constraints on ServiceBuilder imposed by Liferay 7:

  • Only one (external) data source per Service Builder module.
  • The external data source objects, the tables, indexes, etc., must be manually managed.
  • For a transaction to span multiple Service Builder modules, XA transactions must be used.

You can find the GitHub project code for this blog here:

OSGi Module Dependencies

Technical Blogs July 6, 2016 By David H Nebinger

It's going to happen.  At some point in your LR7 development, you're going to build a module which has runtime dependencies.  How do you satisfy those dependencies though?

In this brief blog entry I'll cover some of the options available...

So let's say you have a module which depends upon iText (and its dependencies).  It doesn't really matter what your module is doing, but you have this dependency and now have to figure out how to satisfy it.

Option 1 - Make Them Global

This is probably the easiest option but is also probably the worst.  Any jars in the global class loader (Tomcat's lib and lib/ext, for example) contain classes that can be accessed anywhere, including within the Liferay OSGi container.

But global jars have the typical global problems.  Not only do they need to be global, but all of their dependencies must also be global.  Also, the global classes are the only versions available; you can't vary them to allow different consumers to leverage different versions.

Option 2 - Let OSGi Handle Them

This is the second easiest option, but it's likely to not work.  If you declare a runtime dependency in your module and if OSGi has a bundle that satisfies the dependency, it will be automatically available to your module.

This will work when you know the dependency can be satisfied, either because you're leveraging something the portal provides or you've actually deployed the dependency into the OSGi container (some jars also conveniently include OSGi bundle information and can be deployed directly into the container).

For our example, however, it is unlikely that iText will have already been deployed into OSGi as a module, so relying on OSGi to inject it may not end well.

Declaring the runtime dependency is going to be handled in your build.gradle file.  Here's a snippet for the iText runtime dependency:

runtime group: 'com.lowagie', name: 'itext', version: '1.4.8'

If iText (and its dependencies) have been successfully deployed as OSGi bundles, your runtime declaration will ensure it is available to your module.  If iText is not available, your module will not start and will report unsatisfied dependencies.

Option 3 - Make An Uber Module

Just like uber jars, uber modules have all of the dependent classes exploded out of their original jars and made available within the module jar.

This is actually quite easy to do using Gradle and BND.

In your build.gradle file, you should declare your runtime dependencies just as you did for Option 2.

To make the uber module, you also need to include the resources in your bnd.bnd file:

Include-Resource: @itext-1.4.8.jar

So here you include the name of the dependent jar; usually you can see what it is when Gradle is downloading the dependency or by browsing your Maven repository.

Note that you must also include any dependent jars in your include statement.  For example, iText 2.0.8 has dependencies on BouncyCastle mail and prov, so those would need to be added:

Include-Resource: @itext-2.0.8.jar,@bcmail-138.jar,@bcprov-138.jar

You may need to add these as runtime dependencies so Gradle will have them available for inclusion.
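The corresponding build.gradle additions might look like the following (hypothetical coordinates matching the jar names above; verify them against your Maven repository before relying on them):

```gradle
runtime group: 'com.lowagie', name: 'itext', version: '2.0.8'
runtime group: 'bouncycastle', name: 'bcmail-jdk14', version: '138'
runtime group: 'bouncycastle', name: 'bcprov-jdk14', version: '138'
```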

If you use a zip tool to crack open your module jar, you'll see that all of the individual jars have been exploded and all classes are in the jar.

Option 4 - Include the Jars in the Module

The last option is to include the jars in the module itself, not as an uber module, but just containing the jar files within the module jar.

Similar to options 2 and 3, you will declare your runtime dependencies in the build.gradle file.

The bulk of the work is going to be done in the bnd.bnd file.

First you need to define the Bundle-ClassPath attribute so it includes not just the classes in the module jar but also the extra dependency jars.  In the example below, I'm indicating that my iText jar will be in a lib directory within the module jar:

Bundle-ClassPath: ., lib/itext.jar
Rather than use the Include-Resource header, we're going to use the -includeresource directive to pull the jars into the bundle:

-includeresource: lib/itext.jar=itext-1.4.8.jar
In this format we're saying that lib/itext.jar will be pulled in from itext-1.4.8.jar (which is one of our runtime dependencies so Gradle will have it available for the build).

This format also supports the use of wildcards so you can leave version selection to the build.gradle file.  Here's an example for including any version of commons-lang:

-includeresource: lib/commons-lang.jar=commons-lang-[0-9]*.jar
If you use a zip tool to crack open your module jar, you'll find there are jars now in the bundle under the lib directory.


So which of these options should you choose?  As with all things Liferay, it depends.

The global option is easy as long as you don't need different versions of jars but have a lot of dependencies on the jar.  For example, if you had 20 different modules all dependent upon iText 1.4.8, global may be the best path with regards to runtime resource consumption.

Option 2 can be an easy solution if the dependent jar is also an OSGi bundle.  In this case you can allow for multiple versions and don't have to worry about bnd file editing.

Options 3 and 4 are going to be the most common routes to choose, however.  In both of these cases your dependencies are included within the module so OSGi's class loader is not polluted with different versions of dependent jars.  They are also environment-agnostic; since the modules contain all of their dependencies, the environment does not need to be prepared prior to module deployment.

Personally I stick with Option 4 - in uber modules, the expanded jars tend to step on each other when they contain the same path/file (usually XML or config info).  Option 4 doesn't suffer from these sorts of issues.


Using Gravatars...

General Blogs October 14, 2015 By David H Nebinger

So I'm here late one night and I'm wondering why Liferay is hosting profile pics...

I mean, in this day and age we all have accounts all over the place and many of those places also support profile pics/avatars.  In each of these sites when we create accounts we go and pull our standard avatar pic in and move on.

Recently, however, I was working with a site that used an avatar web service and I thought "wow, this is cool.".

Basically the fine folks at Gravatar have created this site where they host avatar images for you.  As a user you create an account (keyed off your email address) and then load your avatar pic.

Then, on any site that points to gravatar and knows your email address, they can display your gravatar image.  You don't have to have a copy on the separate sites anymore!

And from a user perspective, you get the benefit of being able to change your avatar on one site and all of the other gravatar-aware sites will update automatically.

So my first thought was how I could bring this to Liferay...

Liferay Avatars

So the Liferay profile pics/avatars are all handled by the getPortraitURL() method of the User model class.  At first blush this would seem like an easy way to override Liferay's default portrait URL handling.

All we need to do is create a class which extends UserWrapper and overrides the getPortraitURL() method.  I'm including the code below that will actually do the job:

/**
 * GRAVATAR_BASE_URL: The base URL to use for non-secure gravatars.
 */
public static final String GRAVATAR_BASE_URL = "http://www.gravatar.com/avatar/";

/**
 * GRAVATAR_SECURE_BASE_URL: The base URL to use for secure gravatars.
 */
public static final String GRAVATAR_SECURE_BASE_URL = "https://secure.gravatar.com/avatar/";

/**
 * getPortraitURL: Overriding method to return the gravatar URL instead of the portal's URL.
 * @param themeDisplay
 * @return String The gravatar URL.
 * @throws PortalException
 * @throws SystemException
 */
public String getPortraitURL(ThemeDisplay themeDisplay) throws PortalException, SystemException {
	String emailAddress = getEmailAddress();

	String hash = "00000000000000000000000000000000";

	if ((emailAddress == null) || (emailAddress.trim().length() < 1)) {
		// no email address, stick with the default hash
	} else {
		hash = md5Hex(emailAddress.trim().toLowerCase());
	}

	String def = super.getPortraitURL(themeDisplay);

	StringBuilder sb = new StringBuilder();

	boolean secure = StringUtil.equalsIgnoreCase(Http.HTTPS, PropsUtil.get(PropsKeys.WEB_SERVER_PROTOCOL));

	if (secure) {
		sb.append(GRAVATAR_SECURE_BASE_URL);
	} else {
		sb.append(GRAVATAR_BASE_URL);
	}

	// add the hash value
	sb.append(hash);

	if ((def != null) && (def.trim().length() > 0)) {
		// add the default param so the portal's portrait is used when there is no gravatar
		try {
			String url = URLEncoder.encode(def.trim(), "UTF-8");

			sb.append("?d=").append(url);
		} catch (UnsupportedEncodingException e) {
			// UTF-8 is always available, safe to ignore
		}
	}

	return sb.toString();
}

/**
 * hex: Utility function from gravatar to hex a byte array.
 * @param array
 * @return String The hex string.
 */
public static String hex(byte[] array) {
	StringBuffer sb = new StringBuffer();

	for (int i = 0; i < array.length; ++i) {
		sb.append(Integer.toHexString((array[i] & 0xFF) | 0x100).substring(1, 3));
	}

	return sb.toString();
}

/**
 * md5Hex: Utility function to create an MD5 hex string from a message.
 * @param message
 * @return String The md5 hex.
 */
public static String md5Hex(String message) {
	try {
		MessageDigest md = MessageDigest.getInstance("MD5");

		return hex(md.digest(message.getBytes("CP1252")));
	} catch (NoSuchAlgorithmException e) {
	} catch (UnsupportedEncodingException e) {
	}

	return null;
}
The great part about this implementation is that if the user does not have a gravatar account, the Liferay profile pic will end up being used instead.

Liferay Service Wrappers

So now you have a UserWrapper extension class that uses Gravatar, but how do you get Liferay to use it?

Well you create one or more service wrappers and override the necessary methods so your UserWrapper will be returned.

Sounds easy, right?  Well it isn't.

The first obvious wrapper to create is the UserLocalServiceWrapper extension class.  That's where the User records are returned, so that's a good place to start.

First we'll add some support methods to handle wrapping the users:

/**
 * wrap: Handles the wrapping of a single instance.
 * @param user
 * @return User The wrapped user.
 */
protected User wrap(final User user) {
	if (user == null) return null;

	// if the user is already wrapped, no reason to wrap again, just return it.
	if (ClassUtils.isAssignable(user.getClass(), GravatarUserWrapper.class)) {
		return user;
	}

	return new GravatarUserWrapper(user);
}

/**
 * wrap: Handles the wrapping of multiple instances in a list.
 * @param users
 * @return List The list of wrapped users.
 */
protected List<User> wrap(final List<User> users) {
	if ((users == null) || (users.isEmpty())) return users;

	// create a list to put the wrapped users in to return.
	List<User> toReturn = new ArrayList<User>(users.size());

	for (User user : users) {
		toReturn.add(wrap(user));
	}

	return toReturn;
}

That's actually the easy part.  The next part is to override each method that returns a User instance or a List of users:


public User addDefaultAdminUser(long companyId, String screenName, String emailAddress, Locale locale, String firstName, String middleName, String lastName) throws PortalException, SystemException {
	return wrap(super.addDefaultAdminUser(companyId, screenName, emailAddress, locale, firstName, middleName, lastName));

public User addUser(long creatorUserId, long companyId, boolean autoPassword, String password1, String password2, boolean autoScreenName, String screenName, String emailAddress, long facebookId, String openId, Locale locale, String firstName, String middleName, String lastName, int prefixId, int suffixId, boolean male, int birthdayMonth, int birthdayDay, int birthdayYear, String jobTitle, long[] groupIds, long[] organizationIds, long[] roleIds, long[] userGroupIds, boolean sendEmail, ServiceContext serviceContext) throws PortalException, SystemException {
	return wrap(super.addUser(creatorUserId, companyId, autoPassword, password1, password2, autoScreenName, screenName, emailAddress, facebookId, openId, locale, firstName, middleName, lastName, prefixId, suffixId, male, birthdayMonth, birthdayDay, birthdayYear, jobTitle, groupIds, organizationIds, roleIds, userGroupIds, sendEmail, serviceContext));

public User addUser(User user) throws SystemException {
	return wrap(super.addUser(user));

public User addUserWithWorkflow(long creatorUserId, long companyId, boolean autoPassword, String password1, String password2, boolean autoScreenName, String screenName, String emailAddress, long facebookId, String openId, Locale locale, String firstName, String middleName, String lastName, int prefixId, int suffixId, boolean male, int birthdayMonth, int birthdayDay, int birthdayYear, String jobTitle, long[] groupIds, long[] organizationIds, long[] roleIds, long[] userGroupIds, boolean sendEmail, ServiceContext serviceContext) throws PortalException, SystemException {
	return wrap(super.addUserWithWorkflow(creatorUserId, companyId, autoPassword, password1, password2, autoScreenName, screenName, emailAddress, facebookId, openId, locale, firstName, middleName, lastName, prefixId, suffixId, male, birthdayMonth, birthdayDay, birthdayYear, jobTitle, groupIds, organizationIds, roleIds, userGroupIds, sendEmail, serviceContext));

public User fetchUser(long userId) throws SystemException {
	return wrap(super.fetchUser(userId));
}

public User fetchUserByEmailAddress(long companyId, String emailAddress) throws SystemException {
	return wrap(super.fetchUserByEmailAddress(companyId, emailAddress));
}

public User fetchUserByFacebookId(long companyId, long facebookId) throws SystemException {
	return wrap(super.fetchUserByFacebookId(companyId, facebookId));
}

public User fetchUserById(long userId) throws SystemException {
	return wrap(super.fetchUserById(userId));
}

public User fetchUserByOpenId(long companyId, String openId) throws SystemException {
	return wrap(super.fetchUserByOpenId(companyId, openId));
}

public User fetchUserByScreenName(long companyId, String screenName) throws SystemException {
	return wrap(super.fetchUserByScreenName(companyId, screenName));
}

public User fetchUserByUuidAndCompanyId(String uuid, long companyId) throws SystemException {
	return wrap(super.fetchUserByUuidAndCompanyId(uuid, companyId));
}

public List<User> getCompanyUsers(long companyId, int start, int end) throws SystemException {
	return wrap(super.getCompanyUsers(companyId, start, end));
}

public User getDefaultUser(long companyId) throws PortalException, SystemException {
	return wrap(super.getDefaultUser(companyId));
}

public List<User> getGroupUsers(long groupId) throws SystemException {
	return wrap(super.getGroupUsers(groupId));
}

public List<User> getGroupUsers(long groupId, int start, int end) throws SystemException {
	return wrap(super.getGroupUsers(groupId, start, end));
}

public List<User> getGroupUsers(long groupId, int start, int end, OrderByComparator orderByComparator) throws SystemException {
	return wrap(super.getGroupUsers(groupId, start, end, orderByComparator));
}

public List<User> getInheritedRoleUsers(long roleId, int start, int end, OrderByComparator obc) throws PortalException, SystemException {
	return wrap(super.getInheritedRoleUsers(roleId, start, end, obc));
}

public List<User> getNoAnnouncementsDeliveries(String type) throws SystemException {
	return wrap(super.getNoAnnouncementsDeliveries(type));
}

public List<User> getNoContacts() throws SystemException {
	return wrap(super.getNoContacts());
}

public List<User> getNoGroups() throws SystemException {
	return wrap(super.getNoGroups());
}

public List<User> getOrganizationUsers(long organizationId) throws SystemException {
	return wrap(super.getOrganizationUsers(organizationId));
}

public List<User> getOrganizationUsers(long organizationId, int start, int end) throws SystemException {
	return wrap(super.getOrganizationUsers(organizationId, start, end));
}

public List<User> getOrganizationUsers(long organizationId, int start, int end, OrderByComparator orderByComparator) throws SystemException {
	return wrap(super.getOrganizationUsers(organizationId, start, end, orderByComparator));
}

public List<User> getRoleUsers(long roleId) throws SystemException {
	return wrap(super.getRoleUsers(roleId));
}

public List<User> getRoleUsers(long roleId, int start, int end) throws SystemException {
	return wrap(super.getRoleUsers(roleId, start, end));
}

public List<User> getRoleUsers(long roleId, int start, int end, OrderByComparator orderByComparator) throws SystemException {
	return wrap(super.getRoleUsers(roleId, start, end, orderByComparator));
}

public List<User> getSocialUsers(long userId1, long userId2, int start, int end, OrderByComparator obc) throws PortalException, SystemException {
	return wrap(super.getSocialUsers(userId1, userId2, start, end, obc));
}

public List<User> getSocialUsers(long userId1, long userId2, int type, int start, int end, OrderByComparator obc) throws PortalException, SystemException {
	return wrap(super.getSocialUsers(userId1, userId2, type, start, end, obc));
}

public List<User> getSocialUsers(long userId, int start, int end, OrderByComparator obc) throws PortalException, SystemException {
	return wrap(super.getSocialUsers(userId, start, end, obc));
}

public List<User> getSocialUsers(long userId, int type, int start, int end, OrderByComparator obc) throws PortalException, SystemException {
	return wrap(super.getSocialUsers(userId, type, start, end, obc));
}

public List<User> getTeamUsers(long teamId) throws SystemException {
	return wrap(super.getTeamUsers(teamId));
}

public List<User> getTeamUsers(long teamId, int start, int end) throws SystemException {
	return wrap(super.getTeamUsers(teamId, start, end));
}

public List<User> getTeamUsers(long teamId, int start, int end, OrderByComparator orderByComparator) throws SystemException {
	return wrap(super.getTeamUsers(teamId, start, end, orderByComparator));
}

public User getUser(long userId) throws PortalException, SystemException {
	return wrap(super.getUser(userId));
}

public User getUserByContactId(long contactId) throws PortalException, SystemException {
	return wrap(super.getUserByContactId(contactId));
}

public User getUserByEmailAddress(long companyId, String emailAddress) throws PortalException, SystemException {
	return wrap(super.getUserByEmailAddress(companyId, emailAddress));
}

public User getUserByFacebookId(long companyId, long facebookId) throws PortalException, SystemException {
	return wrap(super.getUserByFacebookId(companyId, facebookId));
}

public User getUserById(long companyId, long userId) throws PortalException, SystemException {
	return wrap(super.getUserById(companyId, userId));
}

public User getUserById(long userId) throws PortalException, SystemException {
	return wrap(super.getUserById(userId));
}

public User getUserByOpenId(long companyId, String openId) throws PortalException, SystemException {
	return wrap(super.getUserByOpenId(companyId, openId));
}

public User getUserByPortraitId(long portraitId) throws PortalException, SystemException {
	return wrap(super.getUserByPortraitId(portraitId));
}

public User getUserByScreenName(long companyId, String screenName) throws PortalException, SystemException {
	return wrap(super.getUserByScreenName(companyId, screenName));
}

public User getUserByUuidAndCompanyId(String uuid, long companyId) throws PortalException, SystemException {
	return wrap(super.getUserByUuidAndCompanyId(uuid, companyId));
}

public List<User> getUserGroupUsers(long userGroupId) throws SystemException {
	return wrap(super.getUserGroupUsers(userGroupId));
}

public List<User> getUserGroupUsers(long userGroupId, int start, int end) throws SystemException {
	return wrap(super.getUserGroupUsers(userGroupId, start, end));
}

public List<User> getUserGroupUsers(long userGroupId, int start, int end, OrderByComparator orderByComparator) throws SystemException {
	return wrap(super.getUserGroupUsers(userGroupId, start, end, orderByComparator));
}

public List<User> getUsers(int start, int end) throws SystemException {
	return wrap(super.getUsers(start, end));
}

public List<User> search(long companyId, String firstName, String middleName, String lastName, String screenName, String emailAddress, int status, LinkedHashMap<String, Object> params, boolean andSearch, int start, int end, OrderByComparator obc) throws SystemException {
	return wrap(, firstName, middleName, lastName, screenName, emailAddress, status, params, andSearch, start, end, obc));
}

public List<User> search(long companyId, String keywords, int status, LinkedHashMap<String, Object> params, int start, int end, OrderByComparator obc) throws SystemException {
	return wrap(, keywords, status, params, start, end, obc));
}

public User updateAgreedToTermsOfUse(long userId, boolean agreedToTermsOfUse) throws PortalException, SystemException {
	return wrap(super.updateAgreedToTermsOfUse(userId, agreedToTermsOfUse));
}

public User updateEmailAddress(long userId, String password, String emailAddress1, String emailAddress2) throws PortalException, SystemException {
	return wrap(super.updateEmailAddress(userId, password, emailAddress1, emailAddress2));
}

public User updateEmailAddress(long userId, String password, String emailAddress1, String emailAddress2, ServiceContext serviceContext) throws PortalException, SystemException {
	return wrap(super.updateEmailAddress(userId, password, emailAddress1, emailAddress2, serviceContext));
}

public User updateEmailAddressVerified(long userId, boolean emailAddressVerified) throws PortalException, SystemException {
	return wrap(super.updateEmailAddressVerified(userId, emailAddressVerified));
}

public User updateFacebookId(long userId, long facebookId) throws PortalException, SystemException {
	return wrap(super.updateFacebookId(userId, facebookId));
}

public User updateIncompleteUser(long creatorUserId, long companyId, boolean autoPassword, String password1, String password2, boolean autoScreenName, String screenName, String emailAddress, long facebookId, String openId, Locale locale, String firstName, String middleName, String lastName, int prefixId, int suffixId, boolean male, int birthdayMonth, int birthdayDay, int birthdayYear, String jobTitle, boolean updateUserInformation, boolean sendEmail, ServiceContext serviceContext) throws PortalException, SystemException {
	return wrap(super.updateIncompleteUser(creatorUserId, companyId, autoPassword, password1, password2, autoScreenName, screenName, emailAddress, facebookId, openId, locale, firstName, middleName, lastName, prefixId, suffixId, male, birthdayMonth, birthdayDay, birthdayYear, jobTitle, updateUserInformation, sendEmail, serviceContext));
}

public User updateJobTitle(long userId, String jobTitle) throws PortalException, SystemException {
	return wrap(super.updateJobTitle(userId, jobTitle));
}

public User updateLastLogin(long userId, String loginIP) throws PortalException, SystemException {
	return wrap(super.updateLastLogin(userId, loginIP));
}

public User updateLockout(User user, boolean lockout) throws PortalException, SystemException {
	return wrap(super.updateLockout(user, lockout));
}

public User updateLockoutByEmailAddress(long companyId, String emailAddress, boolean lockout) throws PortalException, SystemException {
	return wrap(super.updateLockoutByEmailAddress(companyId, emailAddress, lockout));
}

public User updateLockoutById(long userId, boolean lockout) throws PortalException, SystemException {
	return wrap(super.updateLockoutById(userId, lockout));
}

public User updateLockoutByScreenName(long companyId, String screenName, boolean lockout) throws PortalException, SystemException {
	return wrap(super.updateLockoutByScreenName(companyId, screenName, lockout));
}

public User updateModifiedDate(long userId, Date modifiedDate) throws PortalException, SystemException {
	return wrap(super.updateModifiedDate(userId, modifiedDate));
}

public User updateOpenId(long userId, String openId) throws PortalException, SystemException {
	return wrap(super.updateOpenId(userId, openId));
}

public User updatePassword(long userId, String password1, String password2, boolean passwordReset) throws PortalException, SystemException {
	return wrap(super.updatePassword(userId, password1, password2, passwordReset));
}

public User updatePassword(long userId, String password1, String password2, boolean passwordReset, boolean silentUpdate) throws PortalException, SystemException {
	return wrap(super.updatePassword(userId, password1, password2, passwordReset, silentUpdate));
}

public User updatePasswordManually(long userId, String password, boolean passwordEncrypted, boolean passwordReset, Date passwordModifiedDate) throws PortalException, SystemException {
	return wrap(super.updatePasswordManually(userId, password, passwordEncrypted, passwordReset, passwordModifiedDate));
}

public User updatePasswordReset(long userId, boolean passwordReset) throws PortalException, SystemException {
	return wrap(super.updatePasswordReset(userId, passwordReset));
}

public User updatePortrait(long userId, byte[] bytes) throws PortalException, SystemException {
	return wrap(super.updatePortrait(userId, bytes));
}

public User updateReminderQuery(long userId, String question, String answer) throws PortalException, SystemException {
	return wrap(super.updateReminderQuery(userId, question, answer));
}

public User updateScreenName(long userId, String screenName) throws PortalException, SystemException {
	return wrap(super.updateScreenName(userId, screenName));
}

public User updateStatus(long userId, int status, ServiceContext serviceContext) throws PortalException, SystemException {
	return wrap(super.updateStatus(userId, status, serviceContext));
}

public User updateUser(User user) throws SystemException {
	return wrap(super.updateUser(user));
}

public User updateUser(long userId, String oldPassword, String newPassword1, String newPassword2, boolean passwordReset, String reminderQueryQuestion, String reminderQueryAnswer, String screenName, String emailAddress, long facebookId, String openId, String languageId, String timeZoneId, String greeting, String comments, String firstName, String middleName, String lastName, int prefixId, int suffixId, boolean male, int birthdayMonth, int birthdayDay, int birthdayYear, String smsSn, String aimSn, String facebookSn, String icqSn, String jabberSn, String msnSn, String mySpaceSn, String skypeSn, String twitterSn, String ymSn, String jobTitle, long[] groupIds, long[] organizationIds, long[] roleIds, List<UserGroupRole> userGroupRoles, long[] userGroupIds, ServiceContext serviceContext) throws PortalException, SystemException {
	return wrap(super.updateUser(userId, oldPassword, newPassword1, newPassword2, passwordReset, reminderQueryQuestion, reminderQueryAnswer, screenName, emailAddress, facebookId, openId, languageId, timeZoneId, greeting, comments, firstName, middleName, lastName, prefixId, suffixId, male, birthdayMonth, birthdayDay, birthdayYear, smsSn, aimSn, facebookSn, icqSn, jabberSn, msnSn, mySpaceSn, skypeSn, twitterSn, ymSn, jobTitle, groupIds, organizationIds, roleIds, userGroupRoles, userGroupIds, serviceContext));
}

Okay, although this seems like a lot of methods, at least we're done now, right?

Eh, not so fast...

There are still some holes in our implementation.  For example, any DynamicQuery that returns users would not be wrapped.  Any custom queries elsewhere that return users would not be wrapped.  And any other service that returns a User or a list of Users without going through the UserLocalService would not be wrapped either.
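The listing above leans on a List overload of wrap() for methods like getUsers() and search().  Here's a minimal, generic sketch of how such an overload might look; the wrapAll name is mine, and String stands in for the Liferay User class so the example is self-contained:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.UnaryOperator;

public class WrapExample {

    // Generic list-wrapping helper: applies the single-entity wrap to each
    // element so multi-user results come back wrapped too.
    static <T> List<T> wrapAll(List<T> items, UnaryOperator<T> wrap) {
        List<T> wrapped = new ArrayList<>(items.size());
        for (T item : items) {
            wrapped.add(wrap.apply(item));
        }
        return wrapped;
    }

    public static void main(String[] args) {
        // String stands in for User; the lambda stands in for wrap(User).
        List<String> users = List.of("alice", "bob");
        System.out.println(wrapAll(users, u -> "wrapped:" + u));
        // prints [wrapped:alice, wrapped:bob]
    }
}
```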


But actually I think we're in pretty good shape.  We're hitting most of the major points where User objects are returned, and all of those users will be appropriately wrapped to return the gravatar URLs for the portrait pics...
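Incidentally, if hand-writing dozens of overrides like the listing above ever gets tiresome, a dynamic proxy can apply the same wrapping uniformly to every method whose return type is the entity class.  This is just a self-contained sketch of the idea, not how Liferay's service wrappers actually work; the names are illustrative and String again stands in for User:

```java
import java.lang.reflect.InvocationHandler;
import java.lang.reflect.Proxy;
import java.util.List;

public class ProxyWrapExample {

    // Stand-in for UserLocalService; String plays the role of User here.
    interface UserService {
        String fetchUser(long userId);
        List<String> getUsers(int start, int end);
    }

    // Wrap every method whose result is the entity type, in one place,
    // instead of overriding dozens of methods by hand.
    static UserService wrapService(UserService target) {
        InvocationHandler handler = (proxy, method, args) -> {
            Object result = method.invoke(target, args);
            if (result instanceof String) { // stand-in for: result instanceof User
                return "wrapped:" + result;
            }
            return result; // List results would still need their own handling
        };

        return (UserService) Proxy.newProxyInstance(
            UserService.class.getClassLoader(),
            new Class<?>[] {UserService.class},
            handler);
    }

    public static void main(String[] args) {
        UserService raw = new UserService() {
            public String fetchUser(long userId) { return "user" + userId; }
            public List<String> getUsers(int start, int end) { return List.of("a", "b"); }
        };

        UserService wrapped = wrapService(raw);
        System.out.println(wrapped.fetchUser(7)); // prints wrapped:user7
    }
}
```

Note the handler's instanceof check only catches single-entity returns, which mirrors the "holes" discussed above: list results slip through unless handled explicitly.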

A final suggestion I would leave with you: if you're using gravatars, you may not want to support changing profile pics anymore.  All you should need to do is add a line like this to your portal-ext.properties file:

field.editable.domains[portrait]=

Any user that has an email address that matches the domain specified in this line will be able to update the portrait.  When it's empty like above, users should not be able to edit the portrait.

"Should," but not definitely, because there are other field.editable properties that have higher priority than the field-specific check; field.editable.roles=administrator, for example, allows admins to edit all fields, including the portrait image.

But that is just a matter of training your admins not to edit the portraits and you should be fine.
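Putting the two properties together, the relevant fragment might look like this.  This is a sketch; the exact property names here are my reconstruction, so verify them against the portal.properties reference for your Liferay version:

```properties
# No domains may edit the portrait field, which locks portrait
# editing for regular users...
field.editable.domains[portrait]=

# ...but this role-based property has higher priority: administrators
# can still edit every field, including the portrait.
field.editable.roles=administrator
```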

Anyway, you'll want to build this as a service hook plugin and deploy it into your environment, and your users should start seeing their gravatar profile pics.


Vaadin 7 Control Panel is Compatible with Vaadin 7.5.0

General Blogs July 6, 2015 By David H Nebinger

Just a quick note...

Vaadin 7.5.0 was released at the end of June, 2015.

I've completed an initial round of testing, and the Vaadin 7 Control Panel is fully compatible with 7.5.0.  All themes compiled, and so did the widgetsets.

If you run into any problems with 7.5.0 compatibility, please let me know...

Content Creation is Not a Development Activity!

General Blogs June 26, 2015 By David H Nebinger

Let me say that again:

Content Creation is Not a Development Activity!

Pretty strong statement, but for those considering Liferay as a platform it's really an important concept, one that lives in the core of Liferay.

Note: This is really a different kind of blog post for me.  I prefer to keep things technical and discuss practical solutions to common problems.  But I've actually seen this problem come up in many projects and I'm sure many of you have seen it too.  So forgive me while I climb up on my soap box and preach for a while...

The Author's Dilemma

Since the first cave drawings, authors have been able to express themselves using basic tools to put thought to paper (or some other transferable medium).  The only barrier to creating content was the availability of those basic instruments.

In the modern era, publishing became a structured process that applied to most physical print media.  Whether a book, a magazine or a newspaper, the following roles are involved:

  • Writer - Responsible for authoring the content.
  • Editor - Approves content and edits for length, content, format, syntax and spelling.
  • Illustrator - Provides supporting images if it's illustrated.
  • Production Designer - Handles page layout and general design.
  • Printer - Generates the final physical media.

With the invention of the world wide web, publishing on the web still needed to follow this standard process (in lieu of the printer, you have the web browser).  After all, it is still publishing even though it's on a digital platform.

The dilemma these roles face in web publishing is that the simple, basic tools of the past were replaced with complex computer languages that require software developers (and often development teams).  Even the initial HTML-only pages required computer knowledge, tools and skills, things that the average content author doesn't have.

Unfortunately, adding software developers into the content authoring process brought with it the software development process.

Publishing and the Software Development Process

After a lot of pain and suffering, software development lifecycles and processes were defined and implemented to reduce or eliminate bugs.  An experienced developer knows and appreciates the separate environments for development, testing, UAT and production.  A development team's goal is to promote the software artifacts through each successive environment until they eventually make it to production.

The publishing process actually overlays onto this development process without too much contention, especially given the original nature of web publishing.

Remember way back in 1995 when we first started creating web pages?  I do, so I guess that dates me a little.  At that point in time we were creating complete HTML pages with content wrapped in <b>, <i>, <u> and (god forbid) <blink> tags.  We'd connect all of the pages with simple <a> tags and had some <img> tags too.  Pages were extremely simple but, more importantly, an HTML page contained everything.

We quickly moved away from simple HTML and started doing some JSP pages, some PHP pages, some perl, some ruby, some <insert favorite web language here>...  We added CSS to style the content and JavaScript to liven it up.

Even with this shift, however, the files contained everything.  Even with web applications, REST, JavaScript SPAs, etc., we still have the same basic concept: the files contain everything.

So with web publishing, the artifacts created by the developer could be promoted through the environments just as in non-web projects.  The classic publishing roles would overlay onto the environments, perhaps in a different order, but in each environment the different roles could effect change on the content and, on final approval, it could be published to the production environment.

It's been like that for 20 years now, so long that it's practically ingrained in our enterprises that for web content creation/publishing we need to get all those environments lined up and ready for content promotion.

Enter The Liferay

So we come to Liferay with this expectation on how web content publishing works.  I mean, it's been this way for 20 years, so this is the only way it can and should work.  Right?

Well, wrong.  Liferay actually allows the restoration of the classic publishing roles.  Through existing facilities already baked into the Liferay core, a writer can author content.  Through the use of workflow, editors can approve and edit content.  Illustrators can upload images.  Using local or remote staging, production designers can deal with page layouts and design.  The printer doesn't generate physical media, but they can be responsible for publishing the staging content to the live site.

Great!  We've restored the classic publishing process, so everything is puppies and flowers!

Well, not really.  Liferay really sticks it to the old time web content developers and project managers because it practically eliminates the ability to promote content between environments.  Liferay really doesn't allow you to create content in development and promote it to test, UAT and eventually production.

This seems to blow everyone's minds.  How is a PM supposed to say "yes, the developer did the job correctly and the content is ready for promotion"?  How does a developer provide a software artifact to support marking the end of a phase of the SDLC?  We can't possibly do a real web project on Liferay without support for web content promotion...

The Trouble with LARs

It's usually this point where someone will say "Hey, I can export pages/content as a LAR file from one environment and then import it into another!  I can force Liferay to support web content promotion..."

That's really a problem.  It will often work in a test case or for an initial short run, but LARs are the round peg for the "web content promotion" square hole.  To get LARs to work in this process, you end up shoving and pushing and hammering.

Here are some finer points about LARs that need to be mentioned when discussing using them in this fashion:

  • They are extremely fragile.  LAR imports can break if you even look at them funny.
  • Failures are reported like idiot lights on a car dashboard.  The errors logged during failures are quite obtuse.  They don't tell you what part of the LAR failed, they don't report what the conflicts actually are, and they certainly don't suggest how to fix the problem.
  • LAR imports terminate on first exception.  If your LAR actually has five potential import exceptions, you're not going to know it until you complete a few rounds of export/import/analyze to work through all the issues individually.
  • LARs are tightly coupled to the version of Liferay.  LARs are stamped with the version of Liferay the data was exported from and will not import into a different Liferay version.  And this is not a major version restriction (i.e. from 6.1 to 6.2), this is an exact version match on specific GA or SP release.
  • LARs can be tightly coupled to installed plugins.  Depending upon options selected during LAR export, the LAR file can actually contain data related to deployed plugins; if the plugins are not available on the target system, the import will fail.  For example, exporting a page that has a calendar instance, the LAR will contain info about this; if the target Liferay does not have the calendar plugin installed, the LAR import will fail.  One might expect the page to load but exclude the calendar, but it is treated as an entire failure.
  • LARs do not maintain asset IDs during import.  LARs actually do their work on asset names.  This allows the LAR import to work by reassigning a valid surrogate ID in the system where the assets are imported.  However, if any of your code refers to an asset by ID that code will likely fail in other environments as surrogate ID values will not be guaranteed to match.  So if you use a simple URL to embed an image in web content using the ID, it will need to be adjusted in the target system to get it to work.  These kinds of things can be tricky because there is no failure at LAR import time, it really will only be noticed by someone viewing the imported LAR content in the site.
  • It is super easy to export a LAR that cannot be imported.  This is really frustrating and happens a lot with structures.  When you're exporting a journal article that uses a structure it is normal to include the structure in the LAR.  But after you do this once, the structure would have already been loaded into the target environment (maybe).  So you should stop including the structure in the LAR as this allows the import to work until, of course, you try to import into an environment that doesn't have the structure loaded...
  • LARs don't preserve created/modified users and timestamps.  This is valuable meta information, but if I give you a LAR with web content that I created, that information won't be kept in your system unless I'm also a user in your environment.
  • Big LAR files fail to load for mysterious reasons.  Often they are due to some of the previously raised points, but other times they just outright fail and don't give you a reason.

And seriously, the list of issues goes on, but I think you get the point.

Long story short, the idea that LARs can be used to support a long-term process of promoting web content between environments is doomed to failure.  Those of you who are using LARs to support web content promotion have, I'm sure, already seen some of these issues; if you haven't yet, I'd say you've been lucky so far, but I predict that luck will run out.

LARs were designed for specific use cases, but web content promotion between environments was not one of them.

Restoring Sanity

No, LARs are not broken, so don't go opening bugs to get web content promotion working.

What's broken is the view that web content creation must be treated as a development activity.

It's not, and everyone really should be happy about that.  Developers are not web content creators, project managers are not editors or designers, and web content creation is not a project.

So restore sanity in your Liferay installation.  Get the PMs and developers out of the web content creation business.

Set up a workflow process to allow knowledgeable folks to review and approve content.  Set up staging (either local or remote) to give folks time to lay things out and have them reviewed and approved.  Manage all of the normal activities that a publishing process requires.

But do all of these things in production.  Don't try to push them down to the lower levels of an SDLC setup, and don't try to force web content creation through the promotion process as though it's a software development artifact.

Liferay spent a lot of time and effort bringing these web content creation and publication tools into Liferay so the content creation process could be freed from the grips of the IT department.

Take advantage of this freedom.  It is yours...


Will the Real Font Awesome Please Step Forward...

General Blogs May 29, 2015 By David H Nebinger


In my last blog post I had a little rant about the old version of Font Awesome included with Liferay 6.2 and how it would always be out of date because they seem to keep adding new glyphs every day.

So I wondered if it was possible to use the current Font Awesome in a theme and started down a path to test it out.

Font Awesome

For those that don't know, Font Awesome is basically a custom font made from SVG images.  It's awesome in that SVG images scale to practically any size yet still keep nice curves and won't pixelate on you.

By using Font Awesome, you get a catalog of canned images for all kinds of standard buttons that you see on a desktop application or on a web page plus a long list of others.

In Liferay 6.2 they added Font Awesome 3.2.1 into the mix.  At this point in time it is two years old and, although it still works, the set of available glyphs (361 of them) is smaller than the current 4.3.0's 519 glyphs.

Now I can't really comment on how useful the 519 glyphs are.  I really doubt I'll ever have a need for the new viacoin glyph, for example.  But if I do need a glyph, I want to know whether I can use it or not.

Creating the Theme

Start by creating a new theme project for Liferay 6.2.  To keep the theme simple, use classic as the parent for the theme.

Download and expand the latest version of Font Awesome; I'm using 4.3.0.

From your Font Awesome directory (mine is font-awesome-4.3.0), copy the css and fonts folders to your theme.  If using Maven, you'll copy them to the src/main/webapp directory of your theme.  If using the SDK, you'll copy them to the docroot/_diffs folder.

To keep the theme changes as simple as possible, copy the css/main.css from the _styled theme into your new theme's css folder.  Edit this file to add the font-awesome.css line to the end of the file:

@import url(font-awesome.css);

Once you've tested your theme out, you might want to switch over to font-awesome.min.css, but Liferay will already be minimizing the css on the fly so this is not really necessary.

At this point your theme project is done.  Build and deploy it to prepare for the next step.

Testing Font Awesome

So testing is actually pretty easy.  Create a new page in your portal somewhere (doesn't matter where), use your new theme for the page and, to keep things really simple, use a 1 column layout.

Drop a Web Content Display portlet on the page and create a new web content article.  For the body of the article, click the "Source" button and set it to the following:

<div><i class="fa fa-3x fa-coffee"></i></div>

Publish the article to see your FA Coffee Cup:

Drop another Web Content Display portlet on the page and create a new article.  For the body of the article, click on the "Source" button, then grab the HTML fragment from the attached file (there are over 500 glyphs, so I didn't want to put all of that content here in the blog) and paste it in.

And before you say anything, yes, I know it's a table and we don't do those anymore because they are not responsive.  I went with a table because this is just to show that Font Awesome is working in the theme; it's not trying to show the best way to put a table into your journal articles.

When you publish the article and return to your portal, voila, you can see all of the available Font Awesome 4.3.0 glyphs!


So it is possible to use the latest Font Awesome in your themes, leveraging all of the latest glyphs.

What's more, there are similar glyph fonts available from FontIcons.  It just so happens that those, too, can be integrated into a theme in the same way we just added Font Awesome.  Note that I'm not really selling FontIcons or anything; I'm just providing choice and affirmation that, should you need or want FontIcons, they'll work in your theme too.

So at one point I also wondered if it would be possible to upgrade Alloy's older Font Awesome 3.2.1 to the latest 4.3.0.  The simple answer is no.  For the Font Awesome 4 release, the classes changed from "icon-coffee" to a better namespaced "fa-coffee".  Some of the supported stylings also changed.  So Alloy's use of Font Awesome 3 cannot easily be adapted to use Font Awesome 4.

That said, it cannot be ruled out entirely.  I'm sure if you had enough time and energy you could rework all of the Font Awesome 4 class names to match the Font Awesome 3 names and then replace Alloy's older FA files with the hacked versions, but what an ugly hack that would be.  A better path, though one I haven't tried, would be to override how Alloy pulls in and uses the FA 3 files and have it pull in and use the FA 4 files instead (I think all it would take is tweaking the Font Awesome 3 imports pulled in via aui.css for themes that have _styled as a parent, grandparent, great-grandparent, etc., and having them pull in the FA 4 files instead).
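Just to illustrate the direction of that renaming hack: mechanically it amounts to rewriting the class-name prefixes in the FA 4 stylesheet.  A toy sketch (not something I recommend; a real version would also need to handle the .fa base class and the changed stylings):

```java
public class FaRenameHack {

    // Rewrite FA 4 class names (.fa-*) into their FA 3 style (.icon-*)
    // so Alloy-era markup keeps working against the hacked stylesheet.
    static String toFa3Names(String css) {
        return css.replace(".fa-", ".icon-");
    }

    public static void main(String[] args) {
        String css = ".fa-coffee:before { content: \"\\f0f4\"; }";
        System.out.println(toFa3Names(css));
        // prints .icon-coffee:before { content: "\f0f4"; }
    }
}
```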


Creating a Liferay Marketing Theme

General Blogs May 22, 2015 By David H Nebinger


So I'm not the best theme person around.  I'm less a UI guy and more of an implementation person.  Honestly I hate working on themes.  Maybe it's just me, but all of the styling work that you have to do in Liferay, well honestly I find it quite daunting.

And when you throw in Bootstrap for responsiveness, SASS for "simplified" stylesheets, FontAwesome, etc., well for me things just go from bad to worse.

I have my head down in the core all day, why would I want to step out of that clean, organized world (wink) to dip my toes into (what I consider) the mess that is theming?

Well, there's no use complaining about it.  I need a theme for a custom personal site I'm working on.  I envision a marketing-style main page and support pages that will probably adhere to more of a classic sort of flavor for the child pages.  The main page, for a developer, well that will feel pretty worthless, but it's just one of those things that everyone needs these days and I'm itching for a learning experience, so this blog entry is going to follow my effort at creating such a theme in Liferay.

I'm not alone in this kind of need.  Recently my friend Jack Bakker posted in the forums about how to put together a theme like the one Liferay has for the Symposium site, with a WordPress-like theme on the child pages.

So since I was creating a similar theme, I thought I'd capture the how-to about it and include it here in a new blog post.

Starting Point

So along with not being a theme person, well I'm not much of an artist.  I can look at web sites and identify things I like and things I don't like, but if you ask me to create a "standard" marketing page I would spend days scratching out ideas and probably come up empty.

So what I needed was a starting point.  Somewhere that I could grab a workable theme idea.  Something that had the marketing-like concept from the North America Symposium site but where I wasn't really stealing their work.

I remembered a story recently on Slashdot that pointed to a site created by a couple of college students that would generate mock startup websites.  The images, etc. are all covered by a Creative Commons license, so I figure I can borrow from them since I'm more than willing to give appropriate attributions.

So each time you visit the site you should get a completely fresh generated website.  Do it a couple of times and you'll see familiar patterns on the pages: a single image which stays permanently in place while the scrollable area slides over it, a consistent color palette based on the image colors, a responsive layout based on Bootstrap, FontAwesome glyphs, etc.  Basically everything that I'm looking to implement.  Since the site is not based on Liferay, I'm just effectively using the CC images and concepts from their site, but like I said, all real credit for the look and feel goes to them.

I flipped through a couple of times in Firefox until I found a basic color palette and background image that would work.  I used Firefox's save feature to save the complete web page so I got images, css, and the main page.

I'm sure you're wondering why I'm spending a lot of time explaining how I came up with the starting theme.  It's not really my goal to show everyone how to grab someone else's ideas.

But for many of you developers out there, I'm now at the point where you will be when some whiz-bang UI developer comes to your cube with a zip file full of her excellent UI designs in simple HTML, CSS, JS and imagery and says "Here you go, now make Liferay look like this..."

That's hopefully what we'll get out of this blog, a fancy front page theme for Liferay to impress your coworkers and bosses.

Creating the Projects

So we're actually going to be creating three different projects - two theme projects and one layouttpl project.  One theme project will cover the main page and the second theme will provide the wordpressy-like child page themes.  The layouttpl project will allow us to define the separate zones that will exist in the pages and allow the themes to define how the zones will look.

Wanting to leave the IDE out of the picture, we can create our projects using "mvn archetype:generate".  Use the "liferay-theme-archetype" to create the two new theme projects and give them relevant values.  Use the "liferay-layouttpl-archetype" to create the new layout template project and give it the relevant values.

New theme projects use "_styled" as the parent theme in the pom.xml file; that's a good enough starting place for our front page since most other themes would be overkill for what we're doing.

If you're using the SDK, you should do an initial project build of the themes right off.  This will pull in the theme files from the _styled theme so you just have to create the _diffs folders and start adding your override files.  For Maven developers, you can just start adding your overrides to the src/main/webapp folders and when you build the projects, Maven will take care of merging the files.

I'm using Maven for these projects (as I'm doing for all of my projects these days), so if you're using the SDK you'll have to modify the instructions appropriately.  I would recommend that you get off of the SDK as soon as you can, since that's the apparent direction Liferay is steering us.

The themes that I'm creating have CSS, images, JS and, of course, their own Velocity templates, so we should create the src/main/webapp/css, src/main/webapp/images, src/main/webapp/js and src/main/webapp/templates folders where our files will go.

Images are further divided by type.  The marketing theme I'm using really only needs the background image (I'm not going to be using the face images), but I'll put it in a src/main/webapp/images/theme folder (use your theme project as the folder name) to keep it (and possibly others) separate from regular theme images.

The layout template project is pretty much ready to start editing right away, so we'll actually tackle that first.

Creating the Layout Template

The layout template is really important for the marketing type themes.  These themes typically take up 100% of the width of the browser (I love that) and a bunch of different rows each styled differently.

From the reference site, we see eight different rows that contain different content: Hero, Quote, Why, Quote, Team, Quote, Sponsors and Footer.  There's also the header (with navigation).  Most of the rows are a single column, but the Why and Team sections are actually 3 column.

We'll mimic the same layout and use as our row types: hero, quote, why, team, sponsors and footer.  These are going to be the basis for our classes for the sections of the layout.  Our final layout is going to be, in Liferay naming vernacular, 1-1-3-1-3-1-1-1  cheeky

Here's the content for our marketing layout template file:

<div class="full-screen-layout" id="main-content" role="main">
    <div class="portlet-layout row-fluid vol-hero">
        <div class="portlet-column portlet-column-only span12" id="column-1">
            $processor.processColumn("column-1", "portlet-column-content portlet-column-content-only")
        </div>
    </div>
    <div class="portlet-layout row-fluid vol-quote">
        <div class="portlet-column portlet-column-only span12" id="column-2">
            $processor.processColumn("column-2", "portlet-column-content portlet-column-content-only")
        </div>
    </div>
    <div class="portlet-layout row-fluid vol-why">
        <div class="portlet-column portlet-column-first span4" id="column-3">
            $processor.processColumn("column-3", "portlet-column-content portlet-column-content-first")
        </div>
        <div class="portlet-column span4" id="column-4">
            $processor.processColumn("column-4", "portlet-column-content")
        </div>
        <div class="portlet-column portlet-column-last span4" id="column-5">
            $processor.processColumn("column-5", "portlet-column-content portlet-column-content-last")
        </div>
    </div>
    <div class="portlet-layout row-fluid vol-quote">
        <div class="portlet-column portlet-column-only span12" id="column-6">
            $processor.processColumn("column-6", "portlet-column-content portlet-column-content-only")
        </div>
    </div>
    <div class="portlet-layout row-fluid vol-team">
        <div class="portlet-column portlet-column-first span4" id="column-7">
            $processor.processColumn("column-7", "portlet-column-content portlet-column-content-first")
        </div>
        <div class="portlet-column span4" id="column-8">
            $processor.processColumn("column-8", "portlet-column-content")
        </div>
        <div class="portlet-column portlet-column-last span4" id="column-9">
            $processor.processColumn("column-9", "portlet-column-content portlet-column-content-last")
        </div>
    </div>
    <div class="portlet-layout row-fluid vol-quote">
        <div class="portlet-column portlet-column-only span12" id="column-10">
            $processor.processColumn("column-10", "portlet-column-content portlet-column-content-only")
        </div>
    </div>
    <div class="portlet-layout row-fluid vol-sponsors">
        <div class="portlet-column portlet-column-only span12" id="column-11">
            $processor.processColumn("column-11", "portlet-column-content portlet-column-content-only")
        </div>
    </div>
    <div class="portlet-layout row-fluid vol-footer">
        <div class="portlet-column portlet-column-only span12" id="column-12">
            $processor.processColumn("column-12", "portlet-column-content portlet-column-content-only")
        </div>
    </div>
</div>

This defines our layout (I've already copied it to the WAP template so they both match).  The important aspect to notice here is the addition of classes to the divs for the separate rows.  I've prefixed my classes with "vol-" to avoid namespace conflicts, but otherwise you'll see I've used the row types that we picked out earlier.

For the child pages such as Why Attend, well this page has specific theming support but uses a standard 1 column layout template, so we don't need a custom template here.

Layout templates are fairly easy to create, and they're just as easy to build, deploy and test.  Build your layout template project and deploy it to your portal (verifying the deployment completes without error).  Now you should be able to log in and use the layout template to test the results.

Using the classic theme and your new layout, you can drop web content displays into each area (Classic gives you a big drop zone for each column).  Here's what I got when I did it locally:

Not very flashy for certain, but it does give you an idea of where we're headed.

Building the Marketing Theme

Next we'll start on the marketing theme.  I'm not really going to get into the CSS stuff myself, I managed to get it all working and frankly this is a Liferay site, not a CSS site, so we'll leave you to find some other source for the CSS details.

I do want to highlight some of the important parts, though, the ones that make the theme work.


So there are two basic forms that the navigation menu can take.  The first has the navigation at the top of the page above the hero image.  The second, such as the one on the 2015 Symposium site, has the hero image above the navigation.

The first case is close to the classic Liferay navigation found in many themes, even Classic.  The big difference is the absence of the big company image in the upper left corner; instead there's typically a much smaller image (sized to the nav area) that is part of, and to the left of, the navigation.  For this implementation, you're really going to be tweaking the portal_normal.vm file to get rid of the big company logo above the navigation in lieu of a smaller image that is integrated right into the navigation bar.  You can still stick with Liferay's dynamic company logos as long as you keep them small or use something that will scale correctly to the nav bar size.  The rest is just a little bit of HTML and CSS magic to get it to work.

The second option, with the hero above the navigation, well that one is a little more interesting.  In this case it's a heck of a lot easier if you can just build your hero content right into portal_normal.vm so it appears above the normal navigation area.  The 2015 Symposium site uses this technique to lay out the hero image above where the navigation fits in.  What makes this so easy is that you can then stick the nav bar to the top of the page as you scroll, and you don't have to worry about embedding a dynamic journal article in the header.  The downside is that it makes the theme specific to your particular need (i.e. the 2015 Symposium theme can only be used for the 2015 Symposium; you'd have to create a new theme for the 2016 Symposium).

Alternatively you could try to use a dynamic hero section in the content area and then use funky CSS and JS tricks to manipulate the DOM and get it above the nav bar.  It might allow you to use a dynamic hero, but honestly I'd be reluctant to go down this road because it just seems too fragile to me.  I think you'd always be worried about how to get it to work on all past, present and future browsers, mobile devices, etc.  I can't say it's not possible, because I'm sure it is; I guess I'm saying that you'd probably burn through too much time, money and resources just getting it to work and keeping it working.

For my implementation, I'm going with the first version, the nav bar at the top of the page above the content area.  So my portal_normal.vm file starts like this:

<header id="banner" role="banner">
	#if ($has_navigation || $is_signed_in)
		#parse ("$full_templates_path/navigation.vm")

Wait, where's the logo?  Where's the sign in link stuff?  Well, like I said, my implementation now has those items embedded within the nav bar, so I've moved them into navigation.vm.  Basically the logo stuff goes right before the <ul> tag for navigation, and the sign in link stuff comes right after the closing </ul> tag.  Since I'm trying to mimic a marketing page that has greatly simplified menus, I've taken out the second level navigation stuff too.  Note that to mimic the marketing sites that have menus with links to anchors in the page, just add initial static links as <li /> items in the nav bar just before the normal Velocity processing for pages.  You may even choose to exclude the regular nav logic altogether, moving it instead to, say, a hamburger menu.  The choice is yours.

Without any styling applied, the top of our page now looks like:

Not significantly different, but we have the basis for our actual nav bar.  All we need to do is apply styles to the appropriate elements and we'll be good to go.

The Hero

So the hero, for those that don't know, is that big worthless image at the top of most marketing sites that has no practical purpose but to show some sort of artwork.  Mostly it will include the happy faces of people you're supposed to think really like the company's products, though most of the time they are just stock images purchased for use on the page.  Hmm, I wonder if that statement will come across as a bit cynical?

The hero image these days is usually placed as a background image.  Sometimes they stay at the top of the page, but the latest trend is to keep the image at the top of the browser window and let the page content scroll overtop of it, often with one transparent box that scrolls over the window and the rest are opaque and hide the background as the user scrolls down in the page.

My implementation is going to use a similarly placed background image that will be at the top of the browser window and will not scroll.  However, the vol-hero row should be transparent so the background shows through and the content of the hero row will scroll over the image.

So how exactly are we going to build this piece?  Well, it's actually pretty simple.  So first I got my background image and added it to the theme.  The full path for the image is /vol-theme/images/vol/metaldeer.jpg.  I deployed my theme so it would be available.

The rest is done within a web content display.  In the web content display for the vol-hero column, I used the following web content:

<div id="hero-007">
  <p style="text-align: center;"><span style="font-size:24px;"><span style="color:#FFFFFF;">This is the hero page.</span></span></p>
  <p style="text-align: center;">&nbsp;</p>
  <p style="text-align: center;">&nbsp;</p>
  <p style="text-align: center;">&nbsp;</p>
  <p style="text-align: center;">&nbsp;</p>
  <p style="text-align: center;">&nbsp;</p>
  <p style="text-align: center;">&nbsp;</p>
  <p style="text-align: center;">&nbsp;</p>
  <p style="text-align: center;"><span style="font-size:24px;"><span style="color:#FFFFFF;">It is quite large.</span></span></p>
  <p style="text-align: center;">&nbsp;</p>
</div>

Pretty simple, but I want to highlight the div which surrounds the content - hero-007.

That's important because we're going to use that for applying the styling.  Here's the rule that was used in custom.css in the theme:

#hero-007 {
    background: url(../images/vol/metaldeer.jpg);
    background-position: top center;
    background-repeat: no-repeat;
    background-attachment: fixed;
    background-size: cover;
    min-height: 500px;
}

If you didn't want to embed this in the theme, another option is to use advanced styling under the look and feel for the web content display portlet.  Load your background image into Docs and Media and get the link to it, that would be the URL to use, otherwise this rule can be used as-is.

Here's how it looks:

Since the background is positioned at the top center and has attachment fixed, it will stay in place as you scroll the page.  Since the lower rows in our layout don't have the background, the image is just the background for the hero area.

A quick note about opacity...  There are some CSS things and JS things you can do to tweak the opacity of the background image.  My big question is simply "why?"  I mean, unless you don't have an image tool on your computer and/or you don't have direct access to the image there's simply no reason not to tweak the image for the right opacity and use that image.  The upside is it will work on all browsers and versions, including mobile, and it won't suffer the fragility of a CSS/JS solution.
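If you don't have an image editor handy but do have a JDK, you can even pre-bake the opacity programmatically.  This is my own illustration (not part of the theme), using Java2D's AlphaComposite to write the opacity directly into the image's alpha channel:

```java
import java.awt.AlphaComposite;
import java.awt.Graphics2D;
import java.awt.image.BufferedImage;

public class OpacityBaker {

	/**
	 * Returns a copy of the source image with the given opacity (0.0 - 1.0)
	 * baked into the alpha channel, so no CSS/JS opacity tricks are needed
	 * at runtime.
	 */
	public static BufferedImage withOpacity(BufferedImage src, float opacity) {
		BufferedImage out = new BufferedImage(
			src.getWidth(), src.getHeight(), BufferedImage.TYPE_INT_ARGB);

		Graphics2D g = out.createGraphics();
		// SRC_OVER with an extra alpha multiplies every source pixel's alpha.
		g.setComposite(AlphaComposite.getInstance(AlphaComposite.SRC_OVER, opacity));
		g.drawImage(src, 0, 0, null);
		g.dispose();

		return out;
	}
}
```

Pair this with javax.imageio.ImageIO.read()/write() to load and save the actual theme image, and the result works on every browser because the browser never knows the opacity was ever adjusted.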


After seeing the floating background for the hero, tackling the quote sections will be a breeze.  Here we're going to add a class to our theme for the background and the text styling and we'll use a simple web content display area.  The CSS is both long and boring, so it's left out of the blog, but here's how it looks so far:

Note the cool addition of FontAwesome quotes in here, I didn't have to do a thing but reference it in the web content.

There are two options to get the Font Awesome images.  The easiest is to output HTML like <i class="icon-left-quote"></i> and let the _styled theme take care of the rest.  To see the available images in Liferay 6.2, I went to Nick Piesco's Alloy UI Font Awesome Cheat Sheet.  The other option is to use a content style to set the code for the Font Awesome glyph, although the problem there is that some codes from FA do not work because of Liferay's own older version.

Note that I am perturbed that I can't use normal Font Awesome classes and that Alloy has to have its own implementation because, of course, it's already out of date, and even if they shipped an update tomorrow it would probably be out of date the day after.

The quote itself is really not very special.  It's a <blockquote /> tag with a two-line right-justified <h2 /> underneath.  The blockquote has styles applied to it from the theme:

blockquote:before, .aui blockquote:before {
    top: -20px;
    left: -10px;
    content: "\201C";
}

blockquote:after, .aui blockquote:after {
    top: -20px;
    right: 5%;
    content: "\201D";
}

The important part is the content style, the two codes given are for the left and right quotes respectively.

The Why

So the why row is labeled as such because it supposedly answers the question "Why Us" and has 3 boxes to drop quick imagery that provide the top 3 reasons.  I don't really have reasons, but I will fill in the boxes to see how it looks...

These FontAwesome guys are created by using a div that is styled with a background color circle and an <i class="icon-beer"></i> HTML code (different values for the other icons) to overlay the FontAwesome images.  Again these come from Nick Piesco's Cheat Sheet so any of the listed icon types will work for 6.2.

When laying this out I realized that I wanted a simple header for the 3 cells so I inserted a one-column row above the previous vol-team row in the layout above and dropped this web content in.

The others are basically 3 different web content displays dropped in each of the cells.  Yes, the icons are not centered very well, but that is something that you would fix when tweaking the output.


Sure, the theme is not done, but at this point I think it's clear that I'm just going to keep adding content into web content displays and providing some stuff in the theme to style it the way it should look, to make it the way I want.

I hope, at this point, that it's clear that although some aspects of Liferay theming can be daunting, creating a Marketing theme is not really too difficult.  Trust me, as I said earlier I am not a strong CSS person so if I can do this, you can do it too.

So what's next?  Well, the theme is not very responsive.  I tried shrinking the page and it didn't do what I want, so I'm going to look into that.  And I don't like the old dockbar look you get from the _styled theme, so I'm going to pull in the Classic dockbar.  Finally, the nav bar is not sticking to the top of the page when I scroll, so I'll have to flesh out the JavaScript to flip the classes and keep it in place.

Who knows, maybe one or more of these will make it into a future blog post...  wink

Fake ServiceBuilder Entities

General Blogs May 21, 2015 By David H Nebinger


So this is a concept that comes up from time to time on the forums - How do I return a non-database complex object from my ServiceBuilder-generated service layer?

The common answer is usually to bundle the class(es) defining the complex objects into a jar and deploy the jar to the global class loader.  This is a workable solution, but as with all global jars, it can only be updated when the app container is shut down.  When organizations go this route, most of the time the simple global jar with model classes tends to grow into a monstrosity containing the model classes, service classes for dealing with the model classes, etc.  These jars naturally grow to an unwieldy size that requires a lot of maintenance and deployment activity.  This also leads to runtime issues: each consumer of the business logic classes in the global jar is actually instantiating its own instances in its own web application, so there are multiple runtime objects and no "shared" instance access.

So what starts as a seemingly simple solution can actually get bogged down over time.

An alternate solution that was initially outlined in this forum thread centers around what I call, for lack of a snazzier name, Fake Entities.

What Is a Fake Entity?

So what is a Fake Entity really?  Well it is exactly like a regular ServiceBuilder entity minus the backing database table.  Since it is like a ServiceBuilder entity, though, there's a lot of functionality you get:

  • Local interfaces for working with the entities.
  • Remote interfaces for accessing with regular web services.
  • With the remote interfaces you can layer in security checks to manage access.
  • Since the fake entities are managed by a ServiceBuilder implementation, you have one service-providing plugin exposing a single access point for business logic, so there can be data and logic sharing across all service consumers.
  • There are reduced resource requirements portal-wide because there is a single instance of the service provider classes.
  • It can be easy/natural to add and use the entities alongside the regular database-backed entities (some restrictions apply wink).
  • Updates to the entity and the service layer are deployed just like any other ServiceBuilder plugin is deployed.

Basically any benefit you can name for a database-backed ServiceBuilder entity usage applies to the fake entities.

So now you're sold on using Fake Entities, and you probably want to know how to do so.

Defining a DataSource

Yes, we still need a DataSource.  ServiceBuilder depends upon a DataSource definition for each entity and, if one isn't given, Liferay will use its own DataSource (and create tables in the Liferay database).  As we don't want to create database tables, we need an alternate DataSource.

From the forum thread, Remis Lima Baima suggests using Spring's org.springframework.jdbc.datasource.LazyConnectionDataSourceProxy as the DataSource.  The LazyConnectionDataSourceProxy is a DataSource implementation that will not connect to a database until a java.sql.Statement object is created.  For our fake entity purposes, as long as we do not attempt to do anything w/ the database side (i.e. the persistence layer or the finders, etc.) it would work.  Obviously, though, the risk is that some developer may try to use the data tier w/o knowing they're fake entities, and then you have to deal w/ all of the database exceptions thrown as a result of the missing tables.

You could protect yourself from this by using an HSQL database connection to back the LazyConnectionDataSourceProxy.  That way, if entities do get created, they would be in a throwaway database.  The problem here is that eventually the developers might start using them as database entities (since they would work as if they were real database entities), so you become dependent upon a database that was never intended to be a production data source.
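For reference, wiring the proxy into your ServiceBuilder project's ext-spring.xml would look something like this sketch.  The bean id is whatever you reference from service.xml; liferayDataSource is Liferay's own data source bean, and the class and property names come from Spring's API:

```xml
<bean id="lazyDataSource"
	class="org.springframework.jdbc.datasource.LazyConnectionDataSourceProxy">
	<!-- No connection is fetched from the target until a Statement is created. -->
	<property name="targetDataSource" ref="liferayDataSource" />
</bean>
```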

I like the recommendation from my good friend Jack Bakker.  Basically he created a utility class to use to define a DataSource factory that always returns a null DataSource: 

package com.example;

import javax.sql.DataSource;

/**
 * class NoDataSource: A factory class to do nothing but return a null data source object.
 * @author Jack Bakker
 */
public class NoDataSource {
	/**
	 * getDataSource: Supposed to return a DataSource.
	 * @return DataSource Always returns null.
	 */
	public static DataSource getDataSource() {
		return null;
	}
}

I think the brilliance of this idea is that if a developer tries to access the database portion for an entity, it's going to generate a NullPointerException.  While not very elegant, it will leave little doubt that the database layer is off-limits.

Defining the ext-spring.xml Beans

To use your datasource, you have to have some additional beans defined in the META-INF/ext-spring.xml file for your ServiceBuilder project:


<?xml version="1.0"?>

<beans xmlns="http://www.springframework.org/schema/beans"
	xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
	xsi:schemaLocation="http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans.xsd">

	<bean id="noDataSource" class="com.example.NoDataSource" factory-method="getDataSource" />
</beans>


So we're defining our factory class with the factory method to create a DataSource instance.

Note: On using this in an EE SP 11 environment I got a NullPointerException from com.liferay.portal.jsonwebservice.JSONWebServiceRegistrator.  Apparently the noDataSource bean is defined correctly for runtime, but the null value it produces trips up the registrator when it processes the beans, which leads to the NPE.  There's currently a fix in the pipeline to deal with the null bean reference in the registrator, but in the meantime you can actually solve this by creating a "NullDataSource" class that implements javax.sql.DataSource.  In your methods, just return nulls for everything and you're good.  The bean would then become <bean id="noDataSource" class="com.example.NullDataSource" /> and the NPE will go away.
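Here's a minimal sketch of such a NullDataSource (my own illustration of the workaround just described; put it in whatever package you use for NoDataSource):

```java
import java.io.PrintWriter;
import java.sql.Connection;
import java.util.logging.Logger;

import javax.sql.DataSource;

/**
 * NullDataSource: a DataSource stub that satisfies Spring's bean wiring
 * but never touches a real database; every method is a no-op or returns null.
 */
public class NullDataSource implements DataSource {

	@Override
	public Connection getConnection() {
		return null;
	}

	@Override
	public Connection getConnection(String username, String password) {
		return null;
	}

	@Override
	public PrintWriter getLogWriter() {
		return null;
	}

	@Override
	public void setLogWriter(PrintWriter out) {
	}

	@Override
	public void setLoginTimeout(int seconds) {
	}

	@Override
	public int getLoginTimeout() {
		return 0;
	}

	@Override
	public Logger getParentLogger() {
		return null;
	}

	@Override
	public <T> T unwrap(Class<T> iface) {
		return null;
	}

	@Override
	public boolean isWrapperFor(Class<?> iface) {
		return false;
	}
}
```

Any code that does try to use the resulting null Connection will still fail fast with a NullPointerException, which keeps the "database layer is off-limits" property of the NoDataSource approach.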

Defining the Entities in service.xml

Next we'll add the entity definitions in our service.xml file:


<!DOCTYPE service-builder PUBLIC "-//Liferay//DTD Service Builder 6.2.0//EN" "http://www.liferay.com/dtd/liferay-service-builder_6_2_0.dtd">

<service-builder package-path="com.dnebinger.liferay.fake">
	<!-- Define our first fake entity... -->
	<entity name="Foo" uuid="true" local-service="true" remote-service="true" data-source="noDataSource">

		<!-- Even though we're a fake entity, we still need to have one column as a primary key. -->

		<column name="fooId" type="long" primary="true" />

		<!-- Now we can add any other columns to our entity that we want. -->

		<column name="field1" type="String" />
		<column name="field2" type="boolean" />
		<column name="field3" type="int" />
		<column name="field4" type="Date" />
		<column name="field5" type="String" />
	</entity>

	<!-- Define our second fake entity -->
	<entity name="Point" local-service="true" remote-service="true" data-source="noDataSource">
		<column name="x" type="double" primary="true" />
		<column name="y" type="double" primary="true" />
	</entity>
</service-builder>

The Foo entity is just a simple example of what you might do in your own entity.  The Point entity represents a point on a Cartesian plane.  We'll be using points to create a small library of geometric functions based on the Point.

Building Services

Before you can add methods to your implementation class, you need to build the services.  There's nothing special in our entities so building services in this case is a simple process.

It is during the building of services where you may encounter issues in your entity definitions.

One common error is leaving out a primary column.  Even though these entities will not be backed by a database, ServiceBuilder still requires that at least one column is identified as the primary column.  Doesn't matter which one(s) you use, but they will be part of the constructor for the instance.

Otherwise, the same rules that ServiceBuilder enforces for database-backed entities also apply to our fake entities.

Adding Business Methods

Building services gives us the initial service layer, although the default SB methods shouldn't be used (since we don't have a database) so they are not very helpful.  So let's add some methods to our new com.dnebinger.liferay.fake.service.impl.PointLocalServiceImpl class.

I hate exposing things outside of the service layer that I don't need to.  Since our Point has a compound key, the normal createPoint() method takes a PointPK instance.  Instead of requiring the service consumer to know about the PointPK class, I prefer to add a utility method that overloads the creation:

/**
 * createPoint: Utility constructor to hide the PointPK class.
 * @param x
 * @param y
 * @return Point The newly constructed point.
 */
public Point createPoint(double x, double y) {
	return createPoint(new PointPK(x, y));
}

Since these are fake entities, you might want to override the default methods to throw exceptions should some unknowing developer try to invoke the class:

/**
 * @see com.dnebinger.liferay.fake.service.base.PointLocalServiceBaseImpl#addPoint(com.dnebinger.liferay.fake.model.Point)
 */
public Point addPoint(Point point) throws SystemException {
	throw new SystemException("Points cannot be persisted.");
}

It's a lot of busywork to override all of the default methods, but it can save you some headaches in the future.

But we're really here to create some business classes.  I've added three simple methods to find the slope of a line, the distance of a line and calculate the area of a triangle formed by three points:

/**
 * findSlope: Finds the slope of a line for two points, m = (y1 - y2) / (x1 - x2).
 * @param p1
 * @param p2
 * @return double The slope of the line.
 */
public double findSlope(final Point p1, final Point p2) {
	final double m = (p1.getY() - p2.getY()) / (p1.getX() - p2.getX());

	return m;
}

/**
 * findDistance: Finds the distance of a line formed by the two points, uses the distance formula, d = sqrt((x2 - x1)^2 + (y2 - y1)^2).
 * @param p1
 * @param p2
 * @return double The distance of the line.
 */
public double findDistance(final Point p1, final Point p2) {
	final double x2 = Math.pow((p2.getX() - p1.getX()), 2);
	final double y2 = Math.pow((p2.getY() - p1.getY()), 2);

	final double d = Math.sqrt(x2 + y2);

	return d;
}

/**
 * findArea: Finds the area of a triangle formed by the three points using Heron's formula.
 * @param p1
 * @param p2
 * @param p3
 * @return double The area of the triangle.
 */
public double findArea(final Point p1, final Point p2, final Point p3) {
	// we actually need the lengths of the three sides...
	final double a = findDistance(p1, p2);
	final double b = findDistance(p2, p3);
	final double c = findDistance(p3, p1);

	// you also need the value for s, the semi-perimeter
	final double s = 0.5 * (a + b + c);

	// and then you can calculate the area.
	final double area = Math.sqrt(s * (s - a) * (s - b) * (s - c));

	return area;
}

Not very useful or exciting, but it is just meant to demonstrate how we can build a library of functions around our fake entities.
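For a quick sanity check outside of the portal, the same math can be exercised as plain Java.  The PointMath class below is purely illustrative (in the real service these methods live in PointLocalServiceImpl and take Point entities rather than raw coordinates):

```java
// Standalone sketch of the service's math methods; PointMath is hypothetical.
public class PointMath {

    /** Slope of the line through (x1, y1) and (x2, y2): m = (y1 - y2) / (x1 - x2). */
    public static double findSlope(double x1, double y1, double x2, double y2) {
        return (y1 - y2) / (x1 - x2);
    }

    /** Distance between the two points: d = sqrt((x2 - x1)^2 + (y2 - y1)^2). */
    public static double findDistance(double x1, double y1, double x2, double y2) {
        return Math.sqrt(Math.pow(x2 - x1, 2) + Math.pow(y2 - y1, 2));
    }

    /** Area of the triangle with the three vertices, using Heron's formula. */
    public static double findArea(double x1, double y1, double x2, double y2,
            double x3, double y3) {
        double a = findDistance(x1, y1, x2, y2);
        double b = findDistance(x2, y2, x3, y3);
        double c = findDistance(x3, y3, x1, y1);
        double s = 0.5 * (a + b + c);

        return Math.sqrt(s * (s - a) * (s - b) * (s - c));
    }

    public static void main(String[] args) {
        // A 3-4-5 right triangle has legs 3 and 4, so its area should be 6.
        System.out.println(PointMath.findArea(0, 0, 3, 0, 0, 4));
    }
}
```

A 3-4-5 right triangle makes a handy check: Heron's formula gives s = 6 and sqrt(6 * 3 * 1 * 2) = 6, matching the usual (1/2) * base * height.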

We could also create a singleton instance of point that is available to all service consumers:

/**
 * cachedPoint: A shared point that will be visible across every service consumer.
 */
protected Point cachedPoint = null;

/**
 * getCachedPoint: Returns the cached point.
 * @return Point The cached point.
 */
public Point getCachedPoint() {
	return cachedPoint;
}

/**
 * setCachedPoint: Sets the cached point.
 * @param p
 */
public void setCachedPoint(final Point p) {
	cachedPoint = p;
}

Build Services Again

Since we've added methods to our PointLocalServiceImpl class, we have to build services again to add them to the service jar.

We can then use the service jar in other Liferay plugins to access the shared service tier.


So I've introduced fake entities and shown you how to create them, but you may still be wondering how they might apply to you.

The answer comes down to the real reason for ServiceBuilder's existence.  No, ServiceBuilder is not just a broken ORM or anything that you may have heard or thought on your own.

ServiceBuilder's raison d'etre is to provide a service layer that can be shared with many service consumers even across the web application container's class loader boundary.

I may get some pushback on this, but trust me it's the best way to think of ServiceBuilder.  It explains why it is a crippled ORM, why it is normal to include business logic in the service implementation, why it has built-in support for web services and JSON.  It explains why there is one plugin that is providing the service and multiple copies of the service jar floating in the environment with a thin API to access the shared service layer.

When you see ServiceBuilder in this light, it becomes clear that you will want to use entities for a lot more than just your database classes.

Have a set of web services that multiple portlet projects will need to invoke?  If you wrap them in a ServiceBuilder implementation, those portlets only need a service jar.  Your implementation classes can hide all of the details of creating and using the web service client.  For changes to the web service, you just need to make a change to your service provider; the service consumers can remain unchanged (assuming you don't need some sort of API change).

Have a non-database resource that multiple portlet projects will need to read/write?  If you wrap them in a ServiceBuilder implementation, the service consumers can share access to the resource.  Service consumer A can write something to the resource and the other service consumers can read the value back.  The service provider can choose to cache the changes (if updating the resource is a time-intensive process) and since the service consumers are pulling from the service provider the cached value(s) can be returned.

Building a complex library of business logic to be shared with multiple plugins?  If you wrap it in a ServiceBuilder implementation, you get all of the benefits of using a shared implementation w/o the runtime implications of multiple instantiated objects.

Have a Hibernate or JPA data access layer you need to use in multiple plugins?  If you try to copy the code to the separate plugins, they all need their own database connections, their own ehcaches, etc.  If you wrap this in a ServiceBuilder implementation, the service provider becomes a proxy to your data access layer implementation.  Service consumers leverage the service jar to access the service provider; the service provider is the only plugin that requires the database connection, the ehcache, etc.  Since the service consumers are all going through the same instance, issues with stale caches and dirty reads/writes disappear.

All of these things become possible in ServiceBuilder with the introduction of the Fake Entity.

Vaadin Pro Tools and Liferay

General Blogs May 5, 2015 By David H Nebinger

So Vaadin has a complete set of basic widgets (checkboxes, input fields, comboboxes, etc.) and some advanced controls (date picker, menu bar, table, etc.).

They also have the Add-On directory which contains a slew of professional and user contributed widgets (some of my favorites are the masked text fields).

But Vaadin also has the Pro Tools - snazzy widgets with a lot of capabilities, but at a cost.  These widgets aren't free, but they are certainly not outrageous even for small shops or projects.

About Vaadin Pro Tools

There are currently four Pro Tools available:

  • Charts
  • Spreadsheet
  • TestBench
  • TouchKit

Charts is a charting package that combines Vaadin and HighCharts to deliver a responsive charting package that allows for a lot of interaction possibilities between a browser-based chart and server-side logic and data access.

Spreadsheet is just what it sounds like: an implementation of a spreadsheet widget that runs right in the browser, complete with support for functions, editing, etc.

TestBench is an extension for JUnit which allows for UI validation from a CI perspective.  Rather than requiring a set of eyeballs to review the visual changes in a build, TestBench can use screen caps to compare and identify changes that are inconsistent or unexpected.

TouchKit is a framework for building mobile-compatible Vaadin applications.  A fresh spin on the "write once, run anywhere" concept, TouchKit allows Vaadin applications to act as native mobile apps even though they are still web-based.

Liferay and Pro Tools

So the Pro Tools would be great if we were interested in building servlets, but we're building portlets here.  Can we leverage any of these tools in Liferay?

The answer, my friend, is "Yes we can!".  Well, sort of.

So I've built portlets using Charts and Spreadsheets, but I haven't done anything with TestBench or TouchKit.

Honestly I'm not sure how much sense TestBench and TouchKit make in a portal setup since Vaadin only owns a portion of the page.  Maybe they work, maybe they don't, I wouldn't know until I tried.

There are two important caveats for using the Pro Tools in Liferay:

  1. A license file is necessary for the server (or each node in a cluster) to compile the widgetset with the Pro Tool widget.  For Liferay, you don't have to get a license per developer as listed on the Pro Tools license site because the developers are not compiling the widgetset, only the server (or nodes in the cluster) are.
  2. When compiling the widgetset, the Vaadin compiler will perform an internet call to verify the license.  This verification is outside of Liferay's control and therefore will not use Liferay's proxy settings for internet web access.  The server must have internet web access to the Vaadin license servers.  Make sure your network admin allows this kind of access from the server before undertaking this path.

Charts and Spreadsheets do work, however, so we'll start there.

Getting Vaadin Charts

The Vaadin Charts Overview provides an overview for the kinds of things you can do with Vaadin Charts, and the online demo shows what types of charts are available.

You can download the Vaadin Charts jar from the Add-On Directory (choose the Install/Download link on the right side, then click the download tab for the link to the zip file that contains the charts jar).  Expand the zip file so you'll have access to the jar file.

While you're on the directory page, be sure to click the Activate button to download your unique 30 day license code.

Vaadin 7 Control Panel Update

The first thing you're going to want to do is install/upgrade the Vaadin 7 Control Panel.  At the time this blog is published, the new version has been sitting in Liferay QA status for a week, but it should be out soon.  The new version has support for the Pro Tools as well as support for Vaadin 7.4.  Note that if this is the first time you've deployed the control panel you must restart your app server.

Compiling the WidgetSet

The Pro Tool widgets are basically Vaadin Add-Ons, but they have extra requirements for licenses which must be present at widgetset compile time.

This means that the license is only necessary when you compile the widgetset, but it is not necessary for runtime use of the Add-On.  So you need a license (well, one per server node) so the widgetset can be built as new widgets are added or a new Vaadin version is used, but otherwise the runtime use of the widget is unlimited (save whatever resource limitations you might have).

Before compiling the widgetset, we must deploy the Charts Add-On.

Go to the Vaadin 7 Control Panel and choose Add-Ons from the right side.  Click the "Choose File" button and locate the vaadin-charts-2.0.0.jar file from the downloaded zip earlier.  Click the "Upload Add-On" button to get to the confirmation page:

Deploy Charts Add On

Click "Yes" to deploy the Charts Add-On.  You'll then get to the deployed Add-Ons:

Vaadin Charts Selected

Normally you'd now be able to go and compile the widgetset with your new widgets and all would be fine.  But the Pro Tools require a license, so we'll add that by clicking Settings on the right side.

Enter Charts License

Remember the temporary license file you downloaded earlier?  Open the file in your favorite text editor, copy its contents, and paste them into the input field labeled Vaadin Charts License.

Now you can click the "Compile WidgetSet" button and all will be fine.  If you get a license error during the compile, likely the charts license was not entered correctly or your server does not have internet access.

Building a Chart Portlet

The shared environment is now ready to go, so let's create a simple chart portlet to see it all in action...

In order to take IDE preference out of the mix, I'm going to do everything w/ Maven directly.  Using an IDE works too, but your preferred IDE may be different from mine so if I show you the command line version, you should be able to make it work in your environment.

First we'll start a Maven project using the liferay-portlet-archetype:

mvn archetype:generate -DarchetypeGroupId=com.liferay.maven.archetypes -DarchetypeArtifactId=liferay-portlet-archetype -DarchetypeVersion=6.2.2 -DgroupId=com.dnebinger.vaadin -DartifactId=vaadin-charts-demo -Dversion=

This will give us a basic Liferay portlet project for Liferay MVC, but we'll be dumping Liferay MVC for the Vaadin shared environment.  We'll start by adding the following to the pom.xml file:

<!-- Add a repository definition so we can reference the Vaadin Add Ons -->
<!-- I tend to use commons lang to keep the code simpler -->
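Concretely, those two additions might look like the following sketch.  The repository URL is the standard Vaadin add-ons Maven repository; the artifact versions are assumptions you should align with your environment:

```xml
<!-- Add a repository definition so we can reference the Vaadin Add Ons -->
<repositories>
	<repository>
		<id>vaadin-addons</id>
		<url>http://maven.vaadin.com/vaadin-addons</url>
	</repository>
</repositories>

<dependencies>
	<!-- Provided scope: the shared environment supplies the Vaadin jars at runtime. -->
	<dependency>
		<groupId>com.vaadin</groupId>
		<artifactId>vaadin-server</artifactId>
		<version>7.4.0</version>
		<scope>provided</scope>
	</dependency>

	<!-- I tend to use commons lang to keep the code simpler -->
	<dependency>
		<groupId>commons-lang</groupId>
		<artifactId>commons-lang</artifactId>
		<version>2.6</version>
		<scope>provided</scope>
	</dependency>
</dependencies>
```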

The portlet.xml file in the newly created project needs to be tweaked for our new Vaadin portlet.  Change the portlet.xml content to be:

<?xml version="1.0"?>

<portlet-app xmlns="" xmlns:xsi="" xsi:schemaLocation="" version="2.0">
		<!-- Normal Vaadin portlets will use this class as the portlet class -->

		<!-- Specify the main UI class which will be instantiated by the portlet -->
		<!-- Specify the use of the shared portal widgetset -->
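Pulling those pieces together, a minimal portlet.xml for this project might look like the sketch below.  The portlet name and UI class are assumptions based on this project, and the widgetset value in particular must match the name of the shared widgetset compiled by the Vaadin 7 Control Panel in your portal:

```xml
<?xml version="1.0"?>

<portlet-app xmlns="http://java.sun.com/xml/ns/portlet/portlet-app_2_0.xsd"
	xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
	xsi:schemaLocation="http://java.sun.com/xml/ns/portlet/portlet-app_2_0.xsd http://java.sun.com/xml/ns/portlet/portlet-app_2_0.xsd"
	version="2.0">
	<portlet>
		<portlet-name>vaadin-charts-demo</portlet-name>
		<display-name>Vaadin Charts Demo</display-name>

		<!-- Normal Vaadin portlets will use this class as the portlet class -->
		<portlet-class>com.vaadin.server.VaadinPortlet</portlet-class>

		<!-- Specify the main UI class which will be instantiated by the portlet -->
		<init-param>
			<name>UI</name>
			<value>com.dnebinger.vaadin.charts.demo.ChartsDemoUI</value>
		</init-param>

		<!-- Specify the use of the shared portal widgetset -->
		<init-param>
			<name>widgetset</name>
			<value>com.vaadin.portal.PortalDefaultWidgetSet</value>
		</init-param>

		<supports>
			<mime-type>text/html</mime-type>
		</supports>
		<portlet-info>
			<title>Vaadin Charts Demo</title>
		</portlet-info>
	</portlet>
</portlet-app>
```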


The liferay-portlet.xml file needs some minor changes for Vaadin compatibility:

		<!-- Instanceable indicates whether multiple copies are allowed on the same page. -->
		<!-- ajaxable should always be false for Vaadin portlets -->
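A matching liferay-portlet.xml sketch (the portlet name is assumed to match the portlet.xml entry):

```xml
<?xml version="1.0"?>
<!DOCTYPE liferay-portlet-app PUBLIC "-//Liferay//DTD Portlet Application 6.2.0//EN" "http://www.liferay.com/dtd/liferay-portlet-app_6_2_0.dtd">

<liferay-portlet-app>
	<portlet>
		<portlet-name>vaadin-charts-demo</portlet-name>

		<!-- Instanceable indicates whether multiple copies are allowed on the same page. -->
		<instanceable>false</instanceable>

		<!-- ajaxable should always be false for Vaadin portlets -->
		<ajaxable>false</ajaxable>
	</portlet>
</liferay-portlet-app>
```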

Because we're using the shared environment and provided scope in the pom.xml file, we need to add the dependencies into the liferay-plugin-package.properties file:

author=Liferay, Inc.

# First the vaadin jars, then the addon jar(s), then spring jars.
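The dependencies go in via the portal-dependency-jars property, which tells Liferay to copy jars from the portal into the plugin at deploy time.  The exact jar names below are assumptions; they should match whatever your pom and the shared Vaadin environment actually provide:

```properties
name=Vaadin Charts Demo
author=Liferay, Inc.

# First the vaadin jars, then the addon jar(s), then spring jars.
portal-dependency-jars=\
    vaadin-server.jar,\
    vaadin-shared.jar,\
    vaadin-charts-2.0.0.jar
```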

Next is to develop the ChartsDemoUI class.  Below is one that I put together from the Vaadin charts demo site source for a pie chart:

package com.dnebinger.vaadin.charts.demo;

import java.util.Arrays;
import java.util.Random;

import com.vaadin.addon.charts.Chart;
import com.vaadin.addon.charts.model.ChartType;
import com.vaadin.addon.charts.model.Configuration;
import com.vaadin.addon.charts.model.DataSeries;
import com.vaadin.addon.charts.model.DataSeriesItem;
import com.vaadin.addon.charts.model.Labels;
import com.vaadin.addon.charts.model.PlotOptionsPie;
import com.vaadin.addon.charts.model.YAxis;
import com.vaadin.addon.charts.model.style.Color;
import com.vaadin.addon.charts.model.style.SolidColor;
import com.vaadin.addon.charts.themes.ValoLightTheme;
import com.vaadin.annotations.Theme;
import com.vaadin.server.VaadinRequest;
import com.vaadin.ui.Component;
import com.vaadin.ui.UI;
import com.vaadin.ui.VerticalLayout;

public class ChartsDemoUI extends UI {

	private VerticalLayout mainLayout;

	@Override
	protected void init(VaadinRequest request) {
		// create the main vertical layout
		mainLayout = new VerticalLayout();

		// give it some dimensions
		mainLayout.setSizeFull();

		// add the chart and set the layout as the content area.
		mainLayout.addComponent(getChart());
		setContent(mainLayout);
	}

    private static Random rand = new Random(0);
    private static Color[] colors = new ValoLightTheme().getColors();

    protected Component getChart() {
        Component ret = createChart();
        return ret;
    }

    public static Chart createChart() {
        rand = new Random(0);

        Chart chart = new Chart(ChartType.PIE);

        Configuration conf = chart.getConfiguration();

        conf.setTitle("Browser market share, April, 2011");

        YAxis yaxis = new YAxis();
        yaxis.setTitle("Total percent market share");

        PlotOptionsPie pie = new PlotOptionsPie();
        conf.setPlotOptions(pie);

        DataSeries innerSeries = new DataSeries();
        PlotOptionsPie innerPieOptions = new PlotOptionsPie();
        innerPieOptions.setDataLabels(new Labels());
        innerPieOptions.getDataLabels().setFormatter(
                "this.y > 5 ? this.point.name : null");
        innerPieOptions.getDataLabels().setColor(new SolidColor(255, 255, 255));
        innerSeries.setPlotOptions(innerPieOptions);

        Color[] innerColors = Arrays.copyOf(colors, 5);
        innerSeries.setData(new String[] { "MSIE", "Firefox", "Chrome",
                "Safari", "Opera" }, new Number[] { 55.11, 21.63, 11.94, 7.15,
                2.14 }, innerColors);

        DataSeries outerSeries = new DataSeries();
        PlotOptionsPie outerSeriesOptions = new PlotOptionsPie();
        outerSeriesOptions.setDataLabels(new Labels());
        outerSeriesOptions.getDataLabels().setFormatter(
                "this.y > 1 ? '' + this.point.name + ': ' + this.y + '%' : null");
        outerSeries.setPlotOptions(outerSeriesOptions);

        DataSeriesItem[] outerItems = new DataSeriesItem[] {
                /* @formatter:off */
                new DataSeriesItem("MSIE 6.0", 10.85, color(0)),
                new DataSeriesItem("MSIE 7.0", 7.35, color(0)),
                new DataSeriesItem("MSIE 8.0", 33.06, color(0)),
                new DataSeriesItem("MSIE 9.0", 2.81, color(0)),
                new DataSeriesItem("Firefox 2.0", 0.20, color(1)),
                new DataSeriesItem("Firefox 3.0", 0.83, color(1)),
                new DataSeriesItem("Firefox 3.5", 1.58, color(1)),
                new DataSeriesItem("Firefox 3.6", 13.12, color(1)),
                new DataSeriesItem("Firefox 4.0", 5.43, color(1)),
                new DataSeriesItem("Chrome 5.0", 0.12, color(2)),
                new DataSeriesItem("Chrome 6.0", 0.19, color(2)),
                new DataSeriesItem("Chrome 7.0", 0.12, color(2)),
                new DataSeriesItem("Chrome 8.0", 0.36, color(2)),
                new DataSeriesItem("Chrome 9.0", 0.32, color(2)),
                new DataSeriesItem("Chrome 10.0", 9.91, color(2)),
                new DataSeriesItem("Chrome 11.0", 0.50, color(2)),
                new DataSeriesItem("Chrome 12.0", 0.22, color(2)),
                new DataSeriesItem("Safari 5.0", 4.55, color(3)),
                new DataSeriesItem("Safari 4.0", 1.42, color(3)),
                new DataSeriesItem("Safari Win 5.0", 0.23, color(3)),
                new DataSeriesItem("Safari 4.1", 0.21, color(3)),
                new DataSeriesItem("Safari/Maxthon", 0.20, color(3)),
                new DataSeriesItem("Safari 3.1", 0.19, color(3)),
                new DataSeriesItem("Safari 4.1", 0.14, color(3)),
                new DataSeriesItem("Opera 9.x", 0.12, color(4)),
                new DataSeriesItem("Opera 10.x", 0.37, color(4)),
                new DataSeriesItem("Opera 11.x", 1.65, color(4))
                /* @formatter:on */
        };

        outerSeries.setData(outerItems);

        conf.setSeries(innerSeries, outerSeries);

        return chart;
    }

    /**
     * Add a small random factor to a color from the Vaadin theme.
     * @param colorIndex
     *            the index of the color in the colors array.
     * @return the new color
     */
    private static SolidColor color(int colorIndex) {
        SolidColor c = (SolidColor) colors[colorIndex];
        String cStr = c.toString().substring(1);

        int r = Integer.parseInt(cStr.substring(0, 2), 16);
        int g = Integer.parseInt(cStr.substring(2, 4), 16);
        int b = Integer.parseInt(cStr.substring(4, 6), 16);

        double opacity = (50 + rand.nextInt(95 - 50)) / 100.0;

        return new SolidColor(r, g, b, opacity);
    }
}

Now we can build and deploy our shiny new Vaadin Charts portlet.  Drop it on a page and you'll get the following:

Sample Chart

The other charts from the Vaadin demo site can also be used.


Certainly the code for the chart may seem convoluted, but it is not all that bad.  Had I written the code myself instead of just copying the example code from the demo site, I would have added more comments to explain what was going on.  And although this example seems convoluted, it is easy to create much simpler charts.

It's important to note that, just like all other Vaadin code, the chart is written and implemented completely in Java.  You don't have to worry about HTML, Javascript, etc.  You get a responsive chart that looks great and is backed by whatever data you gather to drive it (be it jdbc queries, data coming from ServiceBuilder or the Liferay APIs, etc).

As indicated earlier, I have created Vaadin portlets that use the Charts widget as well as the Spreadsheet widget.  Maybe I'll put together another blog post with a spreadsheet portlet, but for now take my word for it that it works.

A final word about Licensing...  The way that the Vaadin Pro Tools licenses work, by default you purchase a license per developer.  The developer, with the license, can build a Vaadin project where the widgetset is compiled for the project (either a Vaadin servlet or a Vaadin portlet from the Vaadin archetypes).  The compiled project can be built and deployed to an application server or cluster with no additional licensing issues.

Since the shared Vaadin environment compiles the widgetset on each node, a license is needed per instance when compiling the widgetset with one (or more) of the Pro Tools.  Note, however, that since the developers are not compiling the widgetset for the deployed project, developers themselves do not need the license, only the servers do.

Since the license applies to the compiled widgetset, not a compiled project, you can use unlimited instances of the widgets in your portlets.  1,000 charts?  Sure, no extra cost to build and display them.


Auto Vaadin Theme Deployment

General Blogs February 10, 2015 By David H Nebinger


So in two previous blogs I introduced Auto Vaadin Add On Deployment and Vaadin 7 Theming.  At the end of the Vaadin 7 Theming blog I said that there was a similar mechanism for handling the deployment of your themes using the Vaadin 7 Control Panel, the mechanism introduced in the Auto Vaadin Add On Deployment blog.

So this blog is going to cover that mechanism by showing how to handle Auto Vaadin Theme Deployment.


Follow the instructions from the Auto Vaadin Add On Deployment blog through the addition of the required-deployment-contexts entry in the liferay-plugin-package.properties file, then come back here...


We also need to add a startup action class that extends one of the classes from the Vaadin 7 Control Panel.  The class is below:

package com.dnebinger.liferay.vaadin.sample;

import com.dnebinger.liferay.vaadin.startup.AbstractVaadinThemeDeployStartupAction;
import org.apache.commons.lang.StringUtils;

import java.util.ArrayList;
import java.util.List;

/**
 * SampleThemeDeployStartupAction: A sample class to demonstrate how to do automatic theme deployment.
 * @author dnebinger
 */
public class SampleThemeDeployStartupAction extends AbstractVaadinThemeDeployStartupAction {
   private static final String SAMPLE_THEME = "";

   /**
    * getThemeNames: Returns the names of the theme zip files from the WEB-INF/vaadin/themes directory that should be deployed.
    * @return List The list of theme names.
    */
   @Override
   protected List<String> getThemeNames() {
      List<String> names = new ArrayList<String>(1);

      names.add(SAMPLE_THEME);

      return names;
   }

   /**
    * getThemeVersion: Returns the version number to associate with the theme.
    * @param themeName Name of the theme file to get the version number for.
    * @return int The version number for the theme file.
    */
   @Override
   protected int getThemeVersion(String themeName) {
      if (StringUtils.isBlank(themeName)) {
         return 0;
      }

      if (SAMPLE_THEME.equals(themeName)) {
         return 100;
      }

      return 0;
   }
}
This class does two things:

  1. It returns the list of themes to be deployed.
  2. It provides the version number to use for the theme.

To get the theme zip file, create and test your theme using the Vaadin 7 Control Panel.  When you are satisfied with the theme, use the Export button to export the theme as a zip file on your desktop.

The zip file must be added to your project in the src/main/webapp/WEB-INF/vaadin/themes folder.

The theme version number is used to determine if the theme has already been deployed.  If you are shipping an updated version of the theme, be sure to increment the version number.

We also need to add the src/main/webapp/WEB-INF/liferay-hook.xml file to add the hook to the portlet:

<?xml version="1.0"?>
<!DOCTYPE hook PUBLIC "-//Liferay//DTD Hook 6.2.0//EN" "http://www.liferay.com/dtd/liferay-hook_6_2_0.dtd">

<hook>
	<portal-properties>portal.properties</portal-properties>
</hook>

We also need to create the src/main/resources/portal.properties file:

# - Properties for the hook to add the app startup action.
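Assuming the SampleThemeDeployStartupAction class shown earlier, the file just registers the startup action via Liferay's standard application.startup.events property:

```properties
application.startup.events=com.dnebinger.liferay.vaadin.sample.SampleThemeDeployStartupAction
```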

That's it!


With these changes in place, when the portlet is built, deployed and started in the application container, the startup action will process the theme file.

If it has either not been deployed before or if the version is greater than the currently installed version (considered a theme upgrade), the theme will be copied to the portal's html/VAADIN/themes directory and compiled to be ready for immediate use.

If it has already been deployed or the version is less than or equal to the deployed version, nothing happens (since it's already included).

So this eliminates the prework involved with deploying and compiling the themes, so you're back to just deploying your snazzy new Vaadin 7 portlet.

Just as with the automatic Add On deployment, you can include your theme with your project instead of treating them as separate deployments.

Updating Vaadin 7 Shared Environment

General Blogs February 8, 2015 By David H Nebinger


Vaadin 7 is an active project still in development.  New releases come out regularly with bug fixes (mostly browser compatibility).  New versions come out with new functionality, etc.

When using the Vaadin 7 shared environment, the version of Vaadin must be the same across the portal.

The reason is the compiled WidgetSets.  When you compile the WidgetSet, the compiled JavaScript is for the version of Vaadin that is being compiled against.  If you compile the WidgetSet using Vaadin 7.2.7, the code will not work so well against 7.2.6 or 7.3.

Keeping the version of Vaadin in sync in your portal has been difficult, but you have a tool to help you complete this task, the Vaadin 7 Control Panel.

Upgrading Vaadin

So upgrading the Vaadin 7 shared environment takes a couple of steps:

  1. Download the new version of Vaadin.
  2. Replace the existing Vaadin theme/widgetset currently in the portal with the new files from the new distribution.
  3. Recompile the custom themes and the widgetset using the deployed Add Ons.
  4. Replace the Vaadin jars for all Vaadin portlets with the version from the distribution.
  5. Restart the portal so the new jars take effect.

Except for the last step, the Vaadin 7 Control Panel will take care of all of these things automatically; you just have to enable a setting and then follow the upgrade procedure.

Enabling the Portlet Auto Update Feature

The portlet Auto Update feature is enabled/disabled in the Settings page of the Vaadin 7 Control Panel:


Check the "Update Portlets during version change." checkbox to enable the Auto Update feature.

NOTE: I recommend checking the "Automatically Compile WidgetSet on version change." checkbox.  Otherwise you must update and then manually compile the widgetset.  Since both must be done, why not save yourself some clicks and do it all at once?

Choosing The Version

The Overview page shows the currently installed Vaadin 7 version as well as the current stable version.  Note that to get this information, your server makes an HTTP request to the Vaadin site; if your firewall does not allow this outbound traffic, you will not see a value for the stable version.

Click the Change Version link to change the version.

Again, to populate this list your server will open an HTTP connection to the Vaadin site.  If your firewall will not allow this connection, you will not get a list to choose from.

That's okay, though; just follow the directions in the note to download the Vaadin 7 archive manually and upload it using the form at the bottom.

Select the version from the list that you want to install:

After selecting your version, click the Change Version button to go to the next page.

NOTE: There are alpha, beta, pre-release and nightly builds of Vaadin available for download, but the Vaadin 7 Control Panel does not show these for installation.  You can use the manual approach to install one of these versions if you'd like.  Personally I don't believe that you should be trying to use non-stable versions of Vaadin in your portal (you will have enough to worry about using the stable versions).

You have one more chance to reconsider the upgrade.  Click Cancel to return to the Overview page or Change Version to start the upgrade process.

The upgrade process will update the status message at the top and the progress bar so you know what is going on during the upgrade.  A detailed log window appears below to provide all information relative to the upgrade steps.

When the upgrade process is done, you can click Done to return to the Overview page, but if you haven't enabled automatic WidgetSet compilation, click on the Compile Widgetset button and complete the widgetset compile.


When you are done with the upgrade process, the Overview page will reflect the upgraded version of Vaadin:

Note that after upgrading Vaadin 7, you should restart the application server.  This ensures that the correct version of Vaadin is being used by all Vaadin portlets.

As you can see, upgrading Vaadin 7 using the Vaadin 7 Control Panel just couldn't be easier.  And this is something you're going to want to do to keep up with all of the fixes for supported browsers.


Vaadin 7 HighCharts Portlet

General Blogs February 8, 2015 By David H Nebinger


So the last few Vaadin posts were on some basic introductory stuff, but not a whole lot of fun.  I thought I'd do a blog post on a quick and fun little Vaadin 7 portlet, one that uses HighCharts.

So I'm creating a portlet to show a pie chart of registered user ages (based on birthdate).  The fun part about this portlet is that it will use the Liferay API to access the user birthdates, but will use a DynamicQuery to check the birthdates, and we'll break it up into seven groups and display as a pie chart using HighCharts.  And the whole thing will be written for the Vaadin framework.

Project Setup

So first thing you need is the HighCharts for Vaadin 7 Add On.  You can use the Vaadin 7 Control Panel to deploy the Add On to your environment.  This Add On includes the Vaadin 7 widget for integrating with HighCharts in your Vaadin application, but you also need the HighCharts JavaScript file.

Initially I tried to use the version that comes with the Add On but quickly found that it was not going to work.  The JavaScript shipped with the Add On uses jQuery, but not jQuery in noConflict mode, so that version of the HighCharts JavaScript won't work in the portal.

Fortunately, however, there is a viable alternative.  From the HighCharts Download page, one of the adapter options includes "Standalone framework" mode.  This adapter option does not rely on jQuery or any other JavaScript framework.

HighCharts Download

Select the Standalone Framework option and any other portions you want to include in the download.  The portlet being built in this blog uses the Pie Chart, so be sure to include that chart type.  I would suggest selecting everything; that way you'll have a complete HighCharts JavaScript file to use in future charting portlets.

If you do not want the minified JavaScript, uncheck the "Compile code" checkbox before downloading.

Download the JavaScript file and save it for later.

Creating the Project

Since I'm using Intellij, I started a Maven project using the "liferay-portlet-archetype", similarly to the project I did for the "vaadin-sample-portlet" project.  I called the portlet and the project "vaadin-user-age-chart-portlet".

To create the portlet, there will be four classes:

  • UsersChartUI - This is the UI class for the portlet, it extends com.vaadin.ui.UI class.
  • DateRange - This is a utility class that calculates and keeps track of a date range.
  • UserAgeData - This is a utility class which retrieves the age data from the Liferay API.
  • UserPieChart - This is the extension class responsible for rendering the pie chart.


Let's start with the DateRange class.  It basically has the following:

package com.dnebinger.liferay.vaadin.userchart;

import java.io.Serializable;
import java.util.Calendar;
import java.util.Date;

/**
 * DateRange: A container for a date range.
 * Created by dnebinger on 2/6/15.
 */
public class DateRange implements Serializable {
   private static final long serialVersionUID = 58547944852615871L;

   private Calendar startDate;
   private Calendar endDate;

   /**
    * DateRange: Constructor for the instance.
    * @param type
    */
   public DateRange(final int type) {
      endDate = Calendar.getInstance();
      startDate = Calendar.getInstance();

      switch (type) {
         case UserAgeData.AGE_0_10:
            // end date is now

            // start date is just shy of 11 years ago.
            startDate.add(Calendar.YEAR, -11);
            break;
         case UserAgeData.AGE_11_20:
            endDate.add(Calendar.YEAR, -11);

            startDate.add(Calendar.YEAR, -21);
            break;
         case UserAgeData.AGE_21_30:
            endDate.add(Calendar.YEAR, -21);

            startDate.add(Calendar.YEAR, -31);
            break;
         case UserAgeData.AGE_31_40:
            endDate.add(Calendar.YEAR, -31);

            startDate.add(Calendar.YEAR, -41);
            break;
         case UserAgeData.AGE_41_50:
            endDate.add(Calendar.YEAR, -41);

            startDate.add(Calendar.YEAR, -51);
            break;
         case UserAgeData.AGE_51_60:
            endDate.add(Calendar.YEAR, -51);

            startDate.add(Calendar.YEAR, -61);
            break;
         case UserAgeData.AGE_60_PLUS:
            endDate.add(Calendar.YEAR, -61);

            startDate.add(Calendar.YEAR, -121);
            break;
      }

      startDate.add(Calendar.DATE, 1);
   }

   /**
    * getEndDate: Returns the end date in the range.
    * @return Date The end date.
    */
   public Date getEndDate() {
      // start by verifying the day of year comparisons
      Calendar cal = Calendar.getInstance();

      if (cal.get(Calendar.DAY_OF_YEAR) != endDate.get(Calendar.DAY_OF_YEAR)) {
         // update start and end days
         endDate.add(Calendar.DATE, 1);
         startDate.add(Calendar.DATE, 1);
      }

      return endDate.getTime();
   }

   /**
    * getStartDate: Returns the start date in the range.
    * @return Date The start date.
    */
   public Date getStartDate() {
      return startDate.getTime();
   }
}

So the class is initialized for a type code and, given the type code, it sets up the start and end dates to cover the appropriate range.  Each time the starting and end dates are retrieved, the dates will be checked for possible increment (to keep the ranges intact).


The UserAgeData class uses the DynamicQuery API to count users that fall between the start and end date ranges.

package com.dnebinger.liferay.vaadin.userchart;

import com.liferay.portal.kernel.dao.orm.DynamicQuery;
import com.liferay.portal.kernel.dao.orm.DynamicQueryFactoryUtil;
import com.liferay.portal.kernel.dao.orm.RestrictionsFactoryUtil;
import com.liferay.portal.kernel.exception.SystemException;
import com.liferay.portal.kernel.log.Log;
import com.liferay.portal.kernel.log.LogFactoryUtil;
import com.liferay.portal.kernel.util.PortalClassLoaderUtil;
import com.liferay.portal.model.Contact;
import com.liferay.portal.service.UserLocalServiceUtil;

/**
 * class UserAgeData: A class that contains or determines the user age data percentages.
 */
public class UserAgeData {
   private static final Log logger = LogFactoryUtil.getLog(UserAgeData.class);

   // okay, we need to know when the data was retrieved
   private long timestamp;

   // and we need a container for the data
   private double[] percentages;

   private static final long ONE_DAY_MILLIS = 1000L * 60 * 60 * 24;

   public static final int AGE_0_10 = 0;
   public static final int AGE_11_20 = 1;
   public static final int AGE_21_30 = 2;
   public static final int AGE_31_40 = 3;
   public static final int AGE_41_50 = 4;
   public static final int AGE_51_60 = 5;
   public static final int AGE_60_PLUS = 6;

   private DateRange[] ranges = new DateRange[7];

   /**
    * UserAgeData: Constructor.
    */
   public UserAgeData() {

      percentages = new double[7];
      timestamp = 0;

      for (int idx = 0; idx < 7; idx++) {
         ranges[idx] = new DateRange(idx);
      }
   }
   /**
    * getPercentages: Returns either the cached percentages or pulls the count.
    * @return double[] An array of 7 doubles with the age ranges.
    */
   public double[] getPercentages() {
      // if we already have cached users
      if (timestamp > 0) {
         // need to check the timestamp
         long current = System.currentTimeMillis();

         if (current <= (timestamp + ONE_DAY_MILLIS)) {
            // we have values and they are still valid for caching

            if (logger.isDebugEnabled()) logger.debug("Found user data in cache, returning it.");

            return percentages;
         }
      }

      // if we get here then either we have no info or the info is stale.

      long[] counts = new long[7];
      long totalUsers = 0;

      // get the count of users
      for (int idx = 0; idx < 7; idx++) {
         counts[idx] = countUsers(ranges[idx]);
         if (logger.isDebugEnabled()) logger.debug("  " + idx + " has count " + counts[idx]);
         totalUsers += counts[idx];
      }

      // now we can do the math...
      double total = Double.valueOf(totalUsers);

      for (int idx = 0; idx < 7; idx++) {
         percentages[idx] = 100.0 * (Double.valueOf(counts[idx]) / total);

         if (logger.isDebugEnabled()) logger.debug("Percentage " + idx + " is " + percentages[idx] + "%");
      }

      timestamp = System.currentTimeMillis();

      return percentages;
   }
   /**
    * countUsers: Counts the number of users for the given date range.
    * @param range The date range to use for the query.
    * @return long The number of users.
    */
   protected long countUsers(DateRange range) {
      if (logger.isDebugEnabled()) logger.debug("Looking for birthday from " + range.getStartDate() + " to " + range.getEndDate() + ".");

      // create a new dynamic query for the Contact class (it has the birth dates).
      DynamicQuery dq = DynamicQueryFactoryUtil.forClass(Contact.class, PortalClassLoaderUtil.getClassLoader());

      // restrict the count so the birthday falls between the start and end date.
      dq.add(RestrictionsFactoryUtil.between("birthday", range.getStartDate(), range.getEndDate()));

      long count = -1;

      try {
         // count the users that satisfy the query.
         count = UserLocalServiceUtil.dynamicQueryCount(dq);
      } catch (SystemException e) {
         logger.error("Error getting user count: " + e.getMessage(), e);
      }

      if (logger.isDebugEnabled()) logger.debug("Found " + count + " users.");

      return count;
   }
}

This class will keep a cache of counts that will apply for one day.  After 24 hours the cached values are discarded and the queries will be performed again.

Probably not the best implementation, but I'm not shooting for real-time accuracy, just mostly accurate data suitable for a chart on, say, a dashboard page.
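The caching rule boils down to a single timestamp comparison. Here's a minimal standalone sketch of that pattern (class and method names are mine, and the recompute() stand-in replaces the real DynamicQuery work):

```java
public class DailyCache {

    // matches the blog's ONE_DAY_MILLIS refresh interval
    static final long ONE_DAY_MILLIS = 1000L * 60 * 60 * 24;

    private long timestamp = 0;
    private double[] values;

    /** Returns true when the cached values are still considered fresh. */
    public boolean isFresh(long now) {
        return timestamp > 0 && now <= (timestamp + ONE_DAY_MILLIS);
    }

    /** Serves the cached copy when fresh; otherwise recomputes and restamps. */
    public double[] get(long now) {
        if (isFresh(now)) {
            return values;
        }
        values = recompute();
        timestamp = now;
        return values;
    }

    // stand-in for the expensive query work
    protected double[] recompute() {
        return new double[] {10, 20, 30, 40};
    }
}
```

Passing the clock in as a parameter (instead of calling System.currentTimeMillis() inside) also makes the expiry logic trivially testable.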


Next comes the UsersChartUI class that implements the UI for the portlet.

package com.dnebinger.liferay.vaadin.userchart;

import com.vaadin.annotations.Theme;
import com.vaadin.server.VaadinRequest;
import com.vaadin.ui.Alignment;
import com.vaadin.ui.UI;
import com.vaadin.ui.VerticalLayout;
import com.dnebinger.vaadin.highcharts.UserPieChart;

/**
 * Created by dnebinger on 2/6/15.
 */
public class UsersChartUI extends UI {
   private VerticalLayout mainLayout;

   private UserPieChart pieChart;

   private static final UserAgeData userAgeData = new UserAgeData();

   @Override
   protected void init(VaadinRequest vaadinRequest) {
      // create the main vertical layout
      mainLayout = new VerticalLayout();

      // give it a margin and space the internal components.
      mainLayout.setMargin(true);
      mainLayout.setSpacing(true);

      // set the layout as the content area.
      setContent(mainLayout);

      // create the pie chart and center it in the layout.
      pieChart = new UserPieChart();
      mainLayout.addComponent(pieChart);

      mainLayout.setComponentAlignment(pieChart, Alignment.TOP_CENTER);

      // fetch the percentages and push them into the chart.
      double[] percentages = userAgeData.getPercentages();

      pieChart.updateUsers(percentages[0], percentages[1], percentages[2], percentages[3], percentages[4], percentages[5], percentages[6]);
   }
}

Nothing special in the UI: it creates a UserPieChart instance, places it in the layout, and feeds it the current percentages.


The final class is the com.dnebinger.vaadin.highcharts.UserPieChart class.  This is the special integration class joining Vaadin 7 and HighCharts.

package com.dnebinger.vaadin.highcharts;

import com.dnebinger.liferay.vaadin.userchart.UserAgeData;
import com.liferay.portal.kernel.log.Log;
import com.liferay.portal.kernel.log.LogFactoryUtil;
import com.vaadin.annotations.JavaScript;

/**
 * Created by dnebinger on 2/6/15.
 */
@JavaScript({"highcharts-custom.js", "highcharts-connector-nojq.js"})
public class UserPieChart extends AbstractHighChart {
   private static final Log logger = LogFactoryUtil.getLog(UserPieChart.class);
   private static final long serialVersionUID = 7380693815312826144L;

   /**
    * updateUsers: Updates the pie chart using the given percentages.
    * @param age_0_10
    * @param age_11_20
    * @param age_21_30
    * @param age_31_40
    * @param age_41_50
    * @param age_51_60
    * @param age_60_plus
    */
   public void updateUsers(final double age_0_10, final double age_11_20, final double age_21_30, final double age_31_40, final double age_41_50, final double age_51_60, final double age_60_plus) {

      StringBuilder sb = new StringBuilder("var options = { ");

      sb.append("chart: {");
      sb.append("plotBackgroundColor: null,");
      sb.append("plotBorderWidth: null,");
      sb.append("plotShadow: false");
      sb.append("}, ");
      sb.append("title: {");
      sb.append("text: 'Age Breakdown for Registered Users'");
      sb.append("}, ");
      sb.append("plotOptions: {");
      sb.append("pie: {");
      sb.append("allowPointSelect: true,");
      sb.append("cursor: 'pointer',");
      sb.append("dataLabels: {");
      sb.append("enabled: false");
      sb.append("},");
      sb.append("showInLegend: true");
      sb.append("}");
      sb.append("}, ");
      sb.append("series: [{");
      sb.append("type: 'pie',");
      sb.append("name: 'Age Breakdown',");
      sb.append("data: [");
      sb.append("['Zero to 10', ").append(format(age_0_10)).append("],");
      sb.append("['11 to 20', ").append(format(age_11_20)).append("],");
      sb.append("['21 to 30', ").append(format(age_21_30)).append("],");
      sb.append("['31 to 40', ").append(format(age_31_40)).append("],");
      sb.append("['41 to 50', ").append(format(age_41_50)).append("],");
      sb.append("['51 to 60', ").append(format(age_51_60)).append("],");
      sb.append("['Over 60', ").append(format(age_60_plus)).append(']');
      sb.append("]");
      sb.append("}]");

      // close the options
      sb.append(" };");

      String chart = sb.toString();

      if (logger.isDebugEnabled()) logger.debug(chart);

      // hand the completed JavaScript off to the base class; the connector reads it from the shared state.
      setHcjs(chart);
   }
   /**
    * format: Outputs the given value as a string with a single decimal place.
    * @param val The value to format.
    * @return String The formatted value.
    */
   protected String format(final double val) {
      // round to a single decimal place first
      double x = Double.valueOf(Double.valueOf((val * 10.0) + 0.5).longValue()) / 10.0;

      String s = String.format("%s", x);

      int pos = s.indexOf('.');

      if (pos < 0) return s + ".0";

      return s.substring(0, pos + 2);
   }
}

This class requires the most explanation, so here we go.

The first thing is the @JavaScript annotation.  This is a Vaadin 7 annotation that is used to inject JavaScript files into the response stream for the portlet.  For this portlet we need the highcharts-custom.js file (this will be the one that you downloaded earlier) as well as a highcharts-connector-nojq.js file which we will add later.

The updateUsers() method takes the 7 percentage values and outputs JavaScript to create and populate an options object; this object is used by the HighCharts library to render the chart.  I used a simple StringBuilder to build the JavaScript, but a better way would have been to use the org.json.JSONObject class to construct an object and output it as a string (that way you don't have to worry about syntax, quoting, etc.).
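To make that alternative concrete, here is a sketch of assembling the same options structure from maps and lists. I've used a tiny hand-rolled serializer (and Java 9+ Map.of/List.of) so the example stays self-contained; in a real portlet you'd use org.json or Jackson instead, and the class name ChartOptions is mine:

```java
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class ChartOptions {

    /** Minimal JSON writer for maps, lists, strings and numbers (illustration only). */
    static String toJson(Object o) {
        if (o instanceof Map) {
            StringBuilder sb = new StringBuilder("{");
            boolean first = true;
            for (Map.Entry<?, ?> e : ((Map<?, ?>) o).entrySet()) {
                if (!first) sb.append(",");
                first = false;
                sb.append("\"").append(e.getKey()).append("\":").append(toJson(e.getValue()));
            }
            return sb.append("}").toString();
        }
        if (o instanceof List) {
            StringBuilder sb = new StringBuilder("[");
            boolean first = true;
            for (Object item : (List<?>) o) {
                if (!first) sb.append(",");
                first = false;
                sb.append(toJson(item));
            }
            return sb.append("]").toString();
        }
        if (o instanceof String) return "\"" + o + "\"";
        return String.valueOf(o); // numbers, booleans, null
    }

    /** Builds (part of) the structure updateUsers() concatenates by hand. */
    public static String buildOptions(double[] pcts) {
        Map<String, Object> options = new LinkedHashMap<>();
        options.put("title", Map.of("text", "Age Breakdown for Registered Users"));
        options.put("series", List.of(Map.of(
            "type", "pie",
            "name", "Age Breakdown",
            "data", List.of(List.of("Zero to 10", pcts[0]), List.of("11 to 20", pcts[1])))));
        return "var options = " + toJson(options) + ";";
    }
}
```

The win is that quoting, commas and brace matching are handled once in the serializer instead of being hand-counted across forty append() calls.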

The last method, the format() method, is a simple method to output a double as a string to a single decimal place.  Nothing fancy, but it will work.
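For what it's worth, the JDK can do the same single-decimal rounding in one line with a %.1f format specifier; a quick sketch (class name is mine):

```java
import java.util.Locale;

public class PercentFormat {

    /** Formats a value to one decimal place; Locale.US forces a '.' decimal separator. */
    public static String format(double val) {
        return String.format(Locale.US, "%.1f", val);
    }

    public static void main(String[] args) {
        System.out.println(format(33.333333)); // 33.3
    }
}
```

Pinning the locale matters here because the output is being spliced into JavaScript, where a decimal comma would be a syntax error.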

Adding the JavaScript Files

As previously stated, we need to add two JavaScript files to the project, but these do not go in the normal places.  Since we used the @JavaScript annotation in the UserPieChart class, we have to put them into the project's src/main/resources folder, as the JavaScript files need to be on the class path.

Since our Java class is com.dnebinger.vaadin.highcharts.UserPieChart that has the @JavaScript annotation, our JavaScript files have to be in src/main/resources/com/dnebinger/vaadin/highcharts.  The HighCharts JavaScript file that we downloaded earlier must be copied to this directory named "highcharts-custom.js" (to match the value we used in the @JavaScript annotation).
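The mapping from class to resource folder is purely mechanical (package name with dots turned into slashes), so you can sanity-check the placement with a few lines of string work. A sketch, with a hypothetical helper name:

```java
public class ResourcePath {

    /** Computes the src/main/resources folder a package-relative @JavaScript file must live in. */
    public static String folderFor(String fqcn) {
        int lastDot = fqcn.lastIndexOf('.');
        String pkg = (lastDot < 0) ? "" : fqcn.substring(0, lastDot);
        return "src/main/resources/" + pkg.replace('.', '/');
    }

    public static void main(String[] args) {
        System.out.println(folderFor("com.dnebinger.vaadin.highcharts.UserPieChart"));
        // src/main/resources/com/dnebinger/vaadin/highcharts
    }
}
```

If the file ends up anywhere else on the classpath, Vaadin won't find it, since @JavaScript resolves the names relative to the annotated class's package.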

The second JavaScript file we will create in this directory is "highcharts-connector-nojq.js".  The contents of this file should be:

window.com_dnebinger_vaadin_highcharts_UserPieChart = function () {

   this.onStateChange = function () {
      // read state
      var domId = this.getState().domId;
      var hcjs = this.getState().hcjs;

      // evaluate highcharts JS which needs to define var "options"
      eval(hcjs);

      // update the renderTo to point at the dom element for the chart.
      if (options.chart === undefined) {
         options.chart = {renderTo: domId};
      } else {
         options.chart.renderTo = domId;
      }

      // create the new highcharts object to render the chart.
      var chart1 = new Highcharts.Chart(options);
   };
};

This is a connector JavaScript function to bridge Vaadin 7 and HighCharts.  Note the name of the function matches the package path and class for the function name, except the periods are replaced with underscores.
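Since the naming rule is mechanical, the connector function name can be derived from the fully qualified class name directly; a trivial sketch (helper name is mine):

```java
public class ConnectorName {

    /** Derives the Vaadin connector function name: fully qualified class name with '.' -> '_'. */
    public static String forClassName(String fqcn) {
        return fqcn.replace('.', '_');
    }

    public static void main(String[] args) {
        System.out.println(forClassName("com.dnebinger.vaadin.highcharts.UserPieChart"));
        // com_dnebinger_vaadin_highcharts_UserPieChart
    }
}
```

If the function name and the class name drift apart (say, after a package rename), the connector silently never fires, so this is worth double-checking first when the chart doesn't render.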

Since the current HighCharts Add On relies on jQuery, the connector function that comes with the Add On simply will not work in our portal deployment.  This replacement function does not rely on jQuery; it evaluates the chart options and renders the chart directly through the global Highcharts object.


So far I've presented some code, but it's not all that much.  Some utility classes for date calculations, a wrapper class encapsulating the DynamicQuery access, our necessary UI class for the portlet and a UserPieChart class to encapsulate the HighCharts access.

The results of this code can be seen below:

User Ages Pie Chart

Had I taken a little time to create some users with suitable dates, I could make the chart take on any sort of view.  Since I'm using a stock Liferay bundle with my own extra user account, I have 3 users that must be broken down and displayed.

I think you'll agree, however, that all in all this is an extremely simple portlet.  It should be easy to see how you could leverage a number of small Vaadin 7 portlets to create a dashboard page with quite a few graphs on it.

If I were asked to do something like that, I'd make a few changes:

If there were more than, say, 4 or 5 graphs on the page, I'd put the highcharts-custom.js JavaScript file into the theme and remove it from the @JavaScript annotation.  The connector JavaScript would remain, but it is so small (and bound to the chart package/class) that it isn't worth moving.

If there were between two and five graphs on the page, I would at least put highcharts-custom.js into the portlet's JavaScript resources and pull it in using the <header-portlet-javascript /> tag in liferay-portlet.xml.  That would at least allow the portal to import a single copy of the JavaScript rather than multiple copies.  Again it would be removed from the @JavaScript annotation, but the connector JavaScript would remain.

Note: Before anyone asks, I'm not going to be putting the code for this portlet online.  Without the HighCharts JavaScript, it's not really complete and I can't distribute the HighCharts JavaScript myself.  However, all of the code for the portlet is here in the blog and I've left nothing out.  If you have any problems reconstituting the portlet, feel free to ask.

