Rajneesh Kumar
how to configure shared bean context using spring3 and liferay6
August 5, 2011 12:47 PM
Rank: New Member | Posts: 18 | Join Date: August 5, 2011

Hi all,
For my new assignment I am using Liferay: Spring MVC for portlet development, and Spring DI for the custom service and DAO layers. I will have 20 portlets (all developed in Spring MVC) on one web page. In Liferay, each portlet is deployed as its own war file in Tomcat (1 portlet = 1 .war). Rather than designing and developing the service and DAO layers inside each individual portlet, I am using a common service and DAO layer on top of the portlets. To achieve this, I packed all the service and DAO classes, together with the config files (the application context app_context.xml, plus dao_context.xml and service_context.xml, under a folder named "spring_resources"), into one jar. Now I have to load/configure this jar in a shared fashion so that all the controllers residing inside the war files can inject the services packed in the jar. I have put this jar inside Tomcat's ROOT/WEB-INF/lib, but loading fails when Tomcat tries to bind a service to a portlet controller during server startup.
It means the application context (app_context.xml) could not be shared among the portlets (war files).

Any help?
David H Nebinger
RE: how to configure shared bean context using spring3 and liferay6
August 5, 2011 2:10 PM
Community Moderator | Rank: Liferay Legend | Posts: 11510 | Join Date: September 1, 2006

Yes, scratch this implementation as it will have runtime issues.

Each war will have its own instance of the Spring context, so across 20 wars you will have 20 instances of the context loaded into memory at one time, all clones of each other. During Tomcat startup, all 20 war files are handled separately, increasing your startup and shutdown times. And finally, each war will end up with its own database connection pool, so you could conceivably have 20 active connections to the database when you might otherwise get away with one or two.

Your better plan is to combine all of these into a single plugin project, deployed as a single war file. That way they all share the same, single Spring context rather than 20 clones; Tomcat only loads a single war, reducing your startup/shutdown time; and since the portlets all share the same context, they can share the same connection pool, significantly reducing your resource requirements.

Trust me, I'm telling you this from experience. Don't go the separate project path...
Rajneesh Kumar
RE: how to configure shared bean context using spring3 and liferay6
August 5, 2011 10:37 PM

But how is it possible to develop all the portlets in one plugin project? Others have also advised me to club all the portlets into one web project rather than putting them in separate war files, but I have never seen any such example or related documentation provided by Liferay.

Can you please describe this in detail?
David H Nebinger
RE: how to configure shared bean context using spring3 and liferay6
August 7, 2011 3:16 PM

Yeah, it's really easy. In portlet.xml, you just list all of your portlets rather than just one. In liferay-portlet.xml, do the same thing. And in web.xml, if you need any portlet-specific entries in there (like I do with JSF), you include them for each portlet as well.

There's no magic behind it, and in fact I think the Liferay IDE will actually help you do this (right-clicking on an existing portlet plugin and choosing 'add portlet' should add an additional portlet to the plugin).
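For example (portlet names and classes below are made up, and only the elements relevant here are shown), a combined plugin's portlet.xml simply repeats the <portlet> element once per portlet:

```xml
<?xml version="1.0"?>
<portlet-app xmlns="http://java.sun.com/xml/ns/portlet/portlet-app_2_0.xsd"
             version="2.0">

    <portlet>
        <portlet-name>order-entry</portlet-name>
        <portlet-class>com.example.OrderEntryPortlet</portlet-class>
        <supports>
            <mime-type>text/html</mime-type>
            <portlet-mode>view</portlet-mode>
        </supports>
        <portlet-info>
            <title>Order Entry</title>
        </portlet-info>
    </portlet>

    <!-- Second portlet in the same war: one more <portlet> element here,
         plus a matching entry in liferay-portlet.xml. -->
    <portlet>
        <portlet-name>order-history</portlet-name>
        <portlet-class>com.example.OrderHistoryPortlet</portlet-class>
        <supports>
            <mime-type>text/html</mime-type>
            <portlet-mode>view</portlet-mode>
        </supports>
        <portlet-info>
            <title>Order History</title>
        </portlet-info>
    </portlet>

</portlet-app>
```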
Rajneesh Kumar
RE: how to configure shared bean context using spring3 and liferay6
August 8, 2011 9:21 AM

Thanks David for your suggestion.

I can see that this will solve all the resource-sharing problems. But can we still do all the portlet-related work properly this way? I mean, will those portlets be JSR 168 & JSR 286 compliant or not?

I have tried this design (putting all the portlets in one project) and it works fine if I use the default Liferay MVC, but I couldn't configure it properly in the case of Spring MVC. The problem description is below.

As per standard design practice with Spring MVC, we have to declare a web application context for each portlet, like <portlet-name>context.xml, <portlet-name2>context.xml. This leads to problems, and the portlets are not configured correctly. So:

Q1: Do I have to declare one web application context for every portlet?
Q2: If we can put everything in one single web application context, what should its name be so that Liferay will load it automatically?
Q3: How do I configure a PortletModeHandlerMapping instance with multiple controllers for a particular entry key, for example <entry key="view">? Right now it is like below:
<bean id="portletModeHandlerMapping"
      class="org.springframework.web.portlet.handler.PortletModeHandlerMapping">
    <property name="portletModeMap">
        <map>
            <entry key="view">
                <ref bean="oMSRegularViewController" />
            </entry>
        </map>
    </property>
</bean>
So it will send the incoming view-mode request to this controller only for the one portlet that uses oMSRegularViewController for its view, not for all of them. And in my case there will be 20 portlets loading 20 different controllers.
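(For reference, the per-portlet declaration described above looks roughly like this in portlet.xml: each portlet gets its own DispatcherPortlet instance, and the contextConfigLocation init parameter, or the default <portlet-name>-portlet.xml naming convention, points it at that portlet's own context file, where that portlet's handler mapping and controller live. Names below are hypothetical.)

```xml
<portlet>
    <portlet-name>portlet-one</portlet-name>
    <portlet-class>org.springframework.web.portlet.DispatcherPortlet</portlet-class>
    <init-param>
        <name>contextConfigLocation</name>
        <!-- Without this init-param, Spring falls back to
             /WEB-INF/portlet-one-portlet.xml by convention. -->
        <value>/WEB-INF/spring/portlet-one-context.xml</value>
    </init-param>
    <supports>
        <mime-type>text/html</mime-type>
        <portlet-mode>view</portlet-mode>
    </supports>
</portlet>
```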

Please help!!
David H Nebinger
RE: how to configure shared bean context using spring3 and liferay6
August 9, 2011 6:53 AM

Combined portlets in a single deployable are compliant with both JSR 168 and JSR 286. Anything you can do in single plugins, you can do in a combined plugin.
Rajneesh Kumar
RE: how to configure shared bean context using spring3 and liferay6
August 22, 2011 6:35 AM

Thanks David!!!
Angelos Varvitsiotis
RE: how to configure shared bean context using spring3 and liferay6
September 4, 2011 11:10 PM
Rank: New Member | Posts: 9 | Join Date: July 26, 2011

I am facing a similar issue: I am about to start developing a complex data model, consisting of at least twenty services, a good number of portlets (say, around fifty), and several tens of tables pointing in one way or another to various legacy databases and schemas. I am thinking of using plugins and ServiceBuilder as my implementation path. However, I am heavily concerned about whether such a beast can be maintained as a single Liferay plugin.

Having a single plugin for all of the above means that, for, say, a single column addition to the data model, I will have to rebuild and reload/redeploy at run-time all services and portlets contained in that plugin -- effectively, my whole application. For several reasons, I would like to avoid this. So, I was thinking of dividing my data model into "areas", where each "area" would relate to a family of closely related ServiceBuilder services, and splitting each of these families of services, along with their related portlets, into separate Liferay plugins. Naturally, this strategy would introduce inter-plugin dependencies, both compile-time and run-time.

From the reading and searching I've been doing so far, it seems that the suggested (or perhaps the only possible?) way of doing this with Liferay and ServiceBuilder is to copy the .jar file for each service into every dependent plugin. Such a strategy, however, would have the following drawbacks: (a) it would create a source maintenance hell, and (b) it would create a separate Spring context for each plugin, where the same beans would be replicated over and over, devouring my server's resources.

This seems to me like a common problem many people must have faced before. So, my (possibly naive) question is: does Liferay provide a way of auto-wiring Service beans across plug-ins at run-time? This would require a common Spring ApplicationContext shared by all my different plugins. Any clues on whether that's possible in Liferay? Any other suggested strategies for tackling the problem are welcome, too.
Angelos Varvitsiotis
RE: how to configure shared bean context using spring3 and liferay6
December 11, 2012 11:31 PM

Since no one else has replied here, and more than a year after our original quest for a development strategy, we have come up with the following answers, in case anyone wants them.

The startegy [initially "startegy" was a typo -- I meant "strategy" -- but it expresses very well what I am about to say, so I left it in :-)] that seemed to suit us best in the beginning was to develop all of our portlets, and the ServiceBuilder services upon which they depended, in one large, single portlet plugin. As could be expected, this "startegy" had its limits. After a year or so of development, we reached a point where each re-deployment took something like fifteen minutes, which was impractical.

At that point, we decided to split our plugin into a ServiceBuilder hook plugin (services only, no portlets) and one or more portlet plugins (portlets only, no ServiceBuilder).

Developer Studio (or Liferay IDE) proved to be a good tool to begin with. There is an option to let Liferay Portal know about the dependency between plugins and, at the same time, configure Eclipse to reference the JAR produced by the hook plugin from within the portlet plugin, by editing liferay-plugin-package.properties. See this presentation by Olaf Kock for details. The first thing this accomplishes is that, during deployment, it instructs Liferay Portal to delay the deployment of the dependent portlet plugin(s) until the providing hook plugin has been deployed. The second thing it does is auto-adjust the Java build path parameters in Eclipse, so that the JAR produced by ServiceBuilder within the hook plugin is referenced correctly by the portlet plugin (without copying!).
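If I recall the Liferay 6 convention correctly, the dependency declaration amounts to one property in the dependent portlet plugin's liferay-plugin-package.properties (the context name below is made up; it must match the hook plugin's web application context name):

```properties
# Deploy this plugin only after the named plugin(s) are up.
required-deployment-contexts=my-services-hook
```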

Then, a number of subtle points came into play. It was good to find out that my statement about multiple Spring contexts in my previous post was wrong. There is a single context in which the implementations live, namely the context of the plugin that contains the XXXImpl classes. Thus, things like the cache only live once in the application server. This felt reassuring.

Let me be a bit more descriptive here. When running ServiceBuilder (I'll consider a service.xml with a single entity called "Entity" and two columns, entityId and entityData, of type long and String respectively), there are two filesystem hierarchies to consider, namely docroot/WEB-INF/src and docroot/WEB-INF/service. The first contains (among others) the implementation classes for the model (EntityImpl.java, EntityLocalServiceImpl.java, EntityPersistenceImpl.java, EntityFinderImpl.java, etc.), and the second contains (among others) the classloader proxies and interfaces (EntityClp.java, EntityLocalServiceUtil.java, EntityPersistence.java, EntityFinder.java, etc.). Notice that, package-wise, the two hierarchies contain classes in the same packages (e.g., com.test.service). The "service JAR" is built only from the files in the "service" hierarchy. Thus, if one copies (or references) it in another plugin, the dependent plugin sees only the interfaces and the classloader proxy classes, not the implementations themselves.
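For the single-entity example above, the two hierarchies look roughly as follows; the exact sub-packages depend on your service.xml settings, so treat this layout as a sketch:

```
docroot/WEB-INF/src/com/test/          (implementations; stay in the hook plugin)
    model/impl/EntityImpl.java
    service/impl/EntityLocalServiceImpl.java
    service/persistence/EntityPersistenceImpl.java
docroot/WEB-INF/service/com/test/      (interfaces and proxies; go into the service JAR)
    model/Entity.java
    model/EntityClp.java
    service/EntityLocalServiceUtil.java
    service/persistence/EntityPersistence.java
```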

What are classloader proxies? They are just beans (created and populated with values with the aid of Spring) which contain the same methods as the original implementations. E.g., if your service.xml entity contains two fields, a long entityId and a String entityData, your EntityImpl.java (or rather, a class it extends) provides the methods getEntityId() and getEntityData(). However, EntityImpl.class exists only in the (classloader) context of your hook plugin. In the context of your dependent portlet plugin, you will not be using EntityImpl; instead, you will be calling the same methods on EntityClp.class. Notice that, for a given database row of your Entity entity, a single instance of EntityImpl should exist in the whole application server, whereas multiple instances of EntityClp may exist that "point" to that EntityImpl. Does it sound any clearer now?
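A minimal, hand-written sketch of this shape (this is illustrative only, not the generated code; the names come from the example above, and the stubbed custom method anticipates the point about non-accessor methods made below):

```java
// Illustrative only: a hand-written analogue of a ServiceBuilder-style
// classloader proxy. The generated EntityClp is far more elaborate; this
// just shows the shape: same accessors as the Impl, custom methods stubbed.
public class ClpSketch {

    // Shared interface, packaged in the service JAR.
    interface EntityModel {
        long getEntityId();
        String getEntityData();
    }

    // Lives only in the hook plugin's classloader.
    static class EntityImpl implements EntityModel {
        private long entityId;
        private String entityData;
        public long getEntityId() { return entityId; }
        public void setEntityId(long id) { entityId = id; }
        public String getEntityData() { return entityData; }
        public void setEntityData(String d) { entityData = d; }
        // A custom, non-accessor method: available locally only.
        public String entityDataUpperCase() { return entityData.toUpperCase(); }
    }

    // What a dependent plugin sees: same accessors, but the custom method is
    // replaced by a stub that throws (the standard JDK exception is used here
    // as a stand-in for the one the generated code actually throws).
    static class EntityClp implements EntityModel {
        private long entityId;
        private String entityData;
        public long getEntityId() { return entityId; }
        public void setEntityId(long id) { entityId = id; }
        public String getEntityData() { return entityData; }
        public void setEntityData(String d) { entityData = d; }
        public String entityDataUpperCase() {
            throw new UnsupportedOperationException("custom methods are not proxied");
        }
    }

    public static void main(String[] args) {
        EntityClp clp = new EntityClp();
        clp.setEntityId(42L);
        clp.setEntityData("hello");
        System.out.println(clp.getEntityId() + " " + clp.getEntityData());
    }
}
```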

If you take a look at its source, EntityClp.java contains private member variables entityId and entityData, getters and setters for these, and some methods that make it behave a little like EntityImpl.java. Then Spring takes care of the rest. A service called BeanLocator takes action during the deployment of your dependent portlet plugin and locates where the implementation classes are. Liferay then uses Spring to "link" or "bridge" (among others) your EntityImpl.class (which lives in the hook plugin's context) with EntityClp.class (which lives within the context of your dependent portlet plugin). I have mentioned only EntityImpl and EntityClp here, but the same holds for EntityLocalServiceImpl and EntityLocalServiceClp (and for the non-local services, too).

ServiceBuilder uses reflection to create the source of the EntityClp files. This has some interesting consequences. Among others, any methods you created in EntityImpl.java that are not default accessors (getters/setters) of entity-derived class variables (here, entityId and entityData) are replaced by a method that throws an UnsupportedMethodException. The result is that you cannot use such methods non-locally, i.e., from your portlet plugin. You can, however, use them locally, because there you have direct access to the EntityImpl class. LPS-11925 describes a workaround for this problem, but it has several limitations. The main issues seem to be that other classes and interfaces that you use inside non-default accessor methods in EntityImpl.java (java.util.List, for example, or whatever else) (a) may be unavailable in the dependent portlet's classloader (see the discussion of the MyTextUtil class below) and (b) require imports that cannot be generated (reflection cannot find the imported classes in a class file). Why is (a) an issue? Because the workaround of LPS-11925 replicates the code you wrote in your EntityImpl.java source into EntityClp.java; remember, though, that EntityClp operates under a different classloader than EntityImpl. Thus the workaround of LPS-11925 is something you must apply and maintain on your own, at your own risk, and Liferay will not support it. For some reason that eludes me, the exact same issues seem to be no problem for custom methods created in EntityLocalServiceImpl.java, but I haven't looked deep enough into the source to find out why. Maybe later.

Now, there is another subtle difference between moving the service JAR into lib/ext and copying it (or referencing it) in another plugin. In the first case, all dependent plugins (including the provider plugin) use the same interface for, e.g., Entity (the interface generated for EntityImpl). In the second case, each plugin uses a different interface for Entity (because each plugin uses its own classloader to load the Entity interface, although the bytecode loaded by each of those classloaders is identical). I do not know if BeanLocator behaves differently in each case, but word of mouth suggests it may. Moving the service jar into lib/ext may buy you some advantages, like linking to the impl classes directly (I don't know if this is indeed so -- I said it "may" buy you advantages, not that it "will"). I haven't tested it and I haven't looked at the source.

In our case, we deemed that moving the service jar into the application server's lib/ext would introduce serious deployment (and operation, dependency, and version-management) problems. One would need to remember not only to deploy the hook plugin to each server, but also to stop the server, copy the service jar (which would have to be removed manually from the docroot/WEB-INF/lib folder) into the server's lib/ext, and then restart the server. Not great for a production environment. So we worked around the problem using some of the method(s) described in LPS-11925 (we were only interested in the EntityImpl/EntityClp patches).

Another subtle point was the custom classes on which the custom EntityImpl methods depend. For example, if EntityImpl.java imports and uses a custom class org.my.mypackage.MyTextUtil, then (a copy of) this package hierarchy needs to be packaged with the service JAR. Please note that this will be a copy, so no singleton classes or the like should ever be used. In our case, org.my.mypackage.MyTextUtil is a utility class containing only static methods, so using a copy of it presents no problems. Under these constraints, we refactored the code so that org.my.mypackage.MyTextUtil became com.test.service.util.MyTextUtil (where com.test.service is the ServiceBuilder-generated package path for the service classes) and placed the code under the "service" folder. Thus, ant build-service now packages our MyTextUtil class in the service jar, and any other portlet can use it.
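The kind of utility class that is safe to duplicate this way is one with static methods and no state; a sketch (only the class name comes from the post, the method is hypothetical):

```java
// Sketch of a stateless utility of the MyTextUtil kind described above:
// static methods only, no shared state, so a per-plugin copy is harmless.
// (After the refactoring it would live in com.test.service.util.)
public final class MyTextUtil {

    private MyTextUtil() { }  // no instances

    // Hypothetical helper: trim and collapse internal whitespace.
    public static String normalize(String s) {
        if (s == null) {
            return "";
        }
        return s.trim().replaceAll("\\s+", " ");
    }

    public static void main(String[] args) {
        System.out.println(normalize("  hello   world  "));  // prints "hello world"
    }
}
```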

I believe I have said enough to answer my initial questions. I hope these notes will be useful for anyone who, like us, has to build large, inter-dependent ServiceBuilder layers and the portlets that depend on them.

Angelos