views:

1120

answers:

3

I have the following task for a project with 4 nested subprojects, using Maven:

  1. For each child: jar up the resource directory, including project dependencies
  2. Move up to the parent project
  3. With a single command, extract all created archives to various remote destinations (full install), which may include an HTTP server, app server, file server, etc. (mostly *NIX). The destination is specified at the subproject level
  4. It should also be possible to unzip/copy from an individual subproject (partial install)

Files are not Java - mostly various scripts and HTML

I'm looking at various plugins to help with the task: assembly, dependency, antrun, unzip. The dependency plugin looks promising, but I need to unzip not only the dependency jars but the (sub)project content as well. Also, since I can't really tie the operation to the Maven lifecycle, how would I trigger the remote install? mvn dependency:unpack? That's not very descriptive or intuitive. Is it possible to create a custom goal (e.g. project:install) without writing a plugin?

Using Maven is the company standard, so please do not offer alternatives - I'm pretty much stuck with what I have

+3  A: 

Ok, I think the following might do what you need. The drawback of this approach is that there will be an interval between each deployment as the subsequent build is executed. Is this acceptable?

Define a profile in each project with the same name (say "publish"). Within that profile, you can configure the maven-antrun-plugin to deliver the files over FTP (see below).

In the parent project you'll have a modules element, defining each project as a module. If you run mvn install -P publish, each project will be built in turn with the publish profile enabled, and the final artifact published to the target during the install phase. If you need to deploy additional files, modify the include element accordingly.
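
For reference, a minimal sketch of that modules section in the parent POM (the module names here are hypothetical - substitute the actual subproject directories):

<modules>
  <module>subproject1</module>
  <module>subproject2</module>
  <module>subproject3</module>
  <module>subproject4</module>
</modules>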

Note that the parameters for the FTP task have been set as properties; this allows them to be overridden from the command line (see the usage example after the configuration below) and/or inherited from the parent POM.

<profiles>
  <profile>
    <id>publish</id>
    <build>
      <plugins>
        <plugin>
          <groupId>org.apache.maven.plugins</groupId>
          <artifactId>maven-antrun-plugin</artifactId>
          <executions>
            <execution>
              <id>ftp</id>
              <phase>install</phase>
              <configuration>
                <tasks>
                  <!-- push the packaged artifact to the remote host over FTP -->
                  <ftp action="send"
                      server="${ftp.host}" remotedir="${ftp.remotedir}"
                      userid="${ftp.userid}" password="${ftp.password}"
                      depends="${ftp.depends}" verbose="${ftp.verbose}">
                    <fileset dir="${project.build.directory}">
                      <include
                        name="${project.build.finalName}.${project.packaging}"/>
                    </fileset>
                  </ftp>
                </tasks>
              </configuration>
              <goals>
                <goal>run</goal>
              </goals>
            </execution>
          </executions>
          <dependencies>
            <!-- the Ant FTP task needs these on the plugin classpath -->
            <dependency>
              <groupId>commons-net</groupId>
              <artifactId>commons-net</artifactId>
              <version>1.4.1</version>
            </dependency>
            <dependency>
              <groupId>ant</groupId>
              <artifactId>ant-commons-net</artifactId>
              <version>1.6.5</version>
            </dependency>
            <dependency>
              <groupId>ant</groupId>
              <artifactId>ant-nodeps</artifactId>
              <version>1.6.5</version>
            </dependency>
          </dependencies>
        </plugin>
      </plugins>
    </build>
    <properties>
      <ftp.host>hostname</ftp.host>
      <ftp.remotedir>/opt/path/to/install</ftp.remotedir>
      <ftp.userid>user</ftp.userid>
      <ftp.password>mypassword</ftp.password>
      <ftp.depends>yes</ftp.depends>
      <ftp.verbose>no</ftp.verbose>
    </properties>
  </profile>
</profiles>
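
As a usage example, any of those properties can be overridden when invoking the build, e.g. mvn install -P publish -Dftp.host=staging-host (the host value here is hypothetical).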


Update: based on your comment - you could use the dependency plugin to download each artifact, except that a parent can't declare a dependency on its own child, and the parent is built before its children anyway, so it would have to be a separate project. You also need the information about where to deploy each artifact to live somewhere; at the moment the target information is in the individual projects, so it isn't accessible from the deployer project.

Taking this approach, you can define multiple profiles in the new project, one for each artifact. Each profile defines a dependency:copy execution to obtain the jar and an antrun execution to deliver it for one of the projects. Common configuration (such as the dependencies for the antrun plugin) can be pulled out of the profiles (see the sketch after the example below). Also be aware that the properties will be merged if you activate multiple profiles, so you may need to qualify them with the artifact name, for example ftp.artifact1.host.

<profiles>
  <profile>
    <id>deploy-artifact1</id>
    <build>
      <plugins>
        <plugin>
          <groupId>org.apache.maven.plugins</groupId>
          <artifactId>maven-dependency-plugin</artifactId>
          <executions>
            <execution>
              <id>copy-dependency</id>
              <phase>prepare-package</phase>
              <goals>
                <goal>copy</goal>
              </goals>
              <configuration>
                <artifactItems>
                  <artifactItem>
                    <groupId>name.seller.rich</groupId>
                    <artifactId>artifact1</artifactId>
                    <version>1.0.0</version>
                    <type>jar</type>
                    <overWrite>false</overWrite>
                  </artifactItem>
                </artifactItems>
                <outputDirectory>${project.build.directory}/deploy-staging</outputDirectory>
                <overWriteReleases>false</overWriteReleases>
              </configuration>
            </execution>
          </executions>
        </plugin>
        <plugin>
          <groupId>org.apache.maven.plugins</groupId>
          <artifactId>maven-antrun-plugin</artifactId>
          <executions>
            <execution>
              <id>ftp</id>
              <phase>install</phase>
              <configuration>
                <tasks>
                  <!-- push the staged artifact to the remote host over FTP -->
                  <ftp action="send"
                      server="${ftp.host}" remotedir="${ftp.remotedir}"
                      userid="${ftp.userid}" password="${ftp.password}"
                      depends="${ftp.depends}" verbose="${ftp.verbose}">
                    <fileset dir="${project.build.directory}" includes="deploy-staging/"/>
                  </ftp>
                </tasks>
              </configuration>
              <goals>
                <goal>run</goal>
              </goals>
            </execution>
          </executions>
        </plugin>
      </plugins>
    </build>
    <properties>
      <!-- if the properties differ between targets, qualify them with the artifact name -->
      <ftp.host>hostname</ftp.host>
      <ftp.remotedir>/opt/path/to/install</ftp.remotedir>
      <ftp.userid>user</ftp.userid>
      <ftp.password>mypassword</ftp.password>
      <ftp.depends>yes</ftp.depends>
      <ftp.verbose>no</ftp.verbose>
    </properties>
  </profile>
</profiles>
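
As noted above, the common antrun configuration (the Ant FTP task dependencies) can be pulled out of the individual profiles; one way would be to declare them once under build/pluginManagement in the deployer project, along these lines (a sketch, reusing the versions from the first example):

<build>
  <pluginManagement>
    <plugins>
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-antrun-plugin</artifactId>
        <!-- Ant FTP task dependencies, declared once and inherited by the profiles -->
        <dependencies>
          <dependency>
            <groupId>commons-net</groupId>
            <artifactId>commons-net</artifactId>
            <version>1.4.1</version>
          </dependency>
          <dependency>
            <groupId>ant</groupId>
            <artifactId>ant-commons-net</artifactId>
            <version>1.6.5</version>
          </dependency>
          <dependency>
            <groupId>ant</groupId>
            <artifactId>ant-nodeps</artifactId>
            <version>1.6.5</version>
          </dependency>
        </dependencies>
      </plugin>
    </plugins>
  </pluginManagement>
</build>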
Rich Seller
This should work. The profile is what I've been missing altogether. Here's another idea that should help with the "hiccups" - instead of defining the subprojects as modules, define them as dependencies in the parent POM and extract the artifact JARs to their destinations using the dependency plugin. Will that work?
DroidIn.net
I forgot to mention that in our enterprise we use NAC, so any server can be mapped/accessed just as a mapped directory (both Win/*NIX), and there's really no need for FTP or SSH
DroidIn.net
Well, in that case the dependency plugin's copy goal is all you need; just modify the outputDirectory property to point at the target path.
Rich Seller
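For illustration, a minimal dependency:copy configuration along the lines of that comment might look like the following (the mapped directory and artifact coordinates are hypothetical):

<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-dependency-plugin</artifactId>
  <executions>
    <execution>
      <id>copy-to-mapped-share</id>
      <phase>install</phase>
      <goals>
        <goal>copy</goal>
      </goals>
      <configuration>
        <artifactItems>
          <artifactItem>
            <groupId>name.seller.rich</groupId>
            <artifactId>artifact1</artifactId>
            <version>1.0.0</version>
            <type>jar</type>
          </artifactItem>
        </artifactItems>
        <!-- hypothetical mapped network directory on the target server -->
        <outputDirectory>/mnt/appserver/install</outputDirectory>
      </configuration>
    </execution>
  </executions>
</plugin>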
Fantastic! Thanks for your time and advice, Rich. Once I finish what I'm doing I'll update the post, but I'm accepting your suggestion as the solution
DroidIn.net
you're welcome
Rich Seller
A: 

I would look at using the maven-assembly-plugin to do this.

Something like the following assembly descriptor can be used to grab the files from the child projects and put them in output directories.

<assembly>
  <id>xyzzy</id>
  <formats>
    <format>zip</format>
  </formats>
  <fileSets>
    <fileSet>
      <directory>../subproject1/target/</directory>
      <outputDirectory>/foo</outputDirectory>
      <includes>
        <include>*.jar</include>
      </includes>
    </fileSet>
    <fileSet>
      <directory>../subproject1/target/html-output/</directory>
      <outputDirectory>/foo</outputDirectory>
      <includes>
        <include>*.html</include>
        <include>*.js</include>
        <include>*.css</include>
      </includes>
    </fileSet>
    <fileSet>
      <directory>../subproject2/target/</directory>
      <outputDirectory>/bar</outputDirectory>
      <includes>
        <include>**/**</include>
      </includes>
      <excludes>
        <exclude>**/*.exclude-this</exclude>
      </excludes>
    </fileSet>
  </fileSets>
</assembly>
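
For completeness, the descriptor above would be referenced from the maven-assembly-plugin in the aggregating POM, roughly like this (the descriptor path is an assumption):

<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-assembly-plugin</artifactId>
  <configuration>
    <descriptors>
      <!-- assumed location of the descriptor shown above -->
      <descriptor>src/assembly/xyzzy.xml</descriptor>
    </descriptors>
  </configuration>
  <executions>
    <execution>
      <id>make-assembly</id>
      <phase>package</phase>
      <goals>
        <goal>single</goal>
      </goals>
    </execution>
  </executions>
</plugin>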
sal
As I understand it, the assembly plugin will create some sort of archive (jar, war, zip) - that doesn't work for me, since I need the actual files/folders copied to the destination. Actually, the maven-dependency-plugin seems better suited since it can "unpack", but then I'm stuck because I cannot define the subprojects as both modules and dependencies. So the route I'm taking is Rich's suggestion - define a profile and copy the files during the install phase.
DroidIn.net
<format>dir</format> would create a directory
sal
A: 

Maven is not really designed to deploy jars to an arbitrary remote location; its main use is compiling and packaging artifacts. The assembly and dependency goals are primarily used to gather dependencies and files to package into an artifact.

Having said that, Maven does have a deploy goal which uses a component called Wagon; this is primarily intended for deploying to a Maven repository. There is a plugin called Cargo that can be used to deploy artifacts to a remote app server, but it doesn't explode the jar contents by itself (it relies on the target app server to do that). You might be able to extend the Maven Wagon functionality yourself.

Also, it is possible to package a custom lifecycle, but that gets into some pretty low-level Maven mojo (pun intended).
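
One existing option along those lines, not mentioned above and offered only as a hedged suggestion, is the org.codehaus.mojo wagon-maven-plugin, which can upload arbitrary files over Wagon-supported protocols such as scp or ftp. A rough sketch (the server id, URL, and phase binding are assumptions):

<plugin>
  <groupId>org.codehaus.mojo</groupId>
  <artifactId>wagon-maven-plugin</artifactId>
  <executions>
    <execution>
      <id>upload-scripts</id>
      <phase>deploy</phase>
      <goals>
        <goal>upload</goal>
      </goals>
      <configuration>
        <!-- hypothetical values: adjust to the actual target server and path -->
        <serverId>file-server</serverId>
        <fromDir>${project.build.directory}</fromDir>
        <includes>${project.build.finalName}.${project.packaging}</includes>
        <url>scp://fileserver.example.com/opt/path/to/install</url>
      </configuration>
    </execution>
  </executions>
</plugin>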

Ken Liu
I ended up using profiles and antrun - the best of both worlds!
DroidIn.net
great! I guess Antrun is the way to sidestep the whole Maven requirement ;)
Ken Liu