Friday, 15 June 2012

Apache Spark - How to build a deb package containing a single assembly jar with sbt-assembly and sbt-native-packager?

Is it possible to use the sbt-assembly and sbt-native-packager plugins to create a Java application archetype installation where <app>/lib contains the assembly jar, instead of the project jar and its dependencies?

I've built a Spark application and want to add the assembly to the context rather than adding each jar individually.

Edit: I need to build deb packages for deployment. I want the deb bundle to contain the assembly, not the project and dependent jars.

The filesystem layout should be:

<install_dir>
  bin/appname
  conf/application.conf
  lib/appname-assembly.jar

sbt-native-packager also adds a symlink in /usr/bin, which is convenient but not necessary.

This is possible with the native packager. A full example can be found on GitHub; you have to alter the mappings and the scriptClasspath.

Your build.sbt should contain the following parts:

// Assembly settings
assemblySettings

// Specify the name of our fat jar
jarName in assembly := "assembly-project.jar"

// We are building a java server application
packageArchetype.java_server

maintainer in Linux := "Nepomuk Seiler <nepomuk.seiler@mukis.de>"
packageSummary in Linux := "Custom application configuration"
packageDescription := "Custom application configuration"

// Removes all jar mappings in universal and appends the fat jar
mappings in Universal := {
  // universalMappings: Seq[(File, String)]
  val universalMappings = (mappings in Universal).value
  val fatJar = (assembly in Compile).value
  // Removing means filtering
  val filtered = universalMappings filter {
    case (file, name) => !name.endsWith(".jar")
  }
  // Add the fat jar
  filtered :+ (fatJar -> ("lib/" + fatJar.getName))
}
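The answer above also says the scriptClasspath has to be altered, since the generated start script would otherwise still reference the individual dependency jars that were filtered out of the mappings. A minimal sketch of that setting, in the same sbt 0.13 key syntax as the snippet above (the exact value is an assumption based on the jarName defined earlier, not taken verbatim from the question):

```scala
// Assumption: the start script's classpath should list only the fat jar,
// matching the single entry we placed under lib/ in the mappings above.
scriptClasspath := Seq((jarName in assembly).value)
```

With both settings in place, running `sbt debian:packageBin` should produce a .deb whose contents you can inspect (for example with `dpkg --contents`) to confirm that only the assembly jar lands in lib.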

Tags: sbt, apache-spark, sbt-native-packager, sbt-assembly
