000170032 001__ 170032
000170032 005__ 20190316235227.0
000170032 037__ $$aCONF
000170032 245__ $$aTools and Frameworks for Big Learning in Scala: Leveraging the Language for High Productivity and Performance
000170032 269__ $$a2011
000170032 260__ $$c2011
000170032 336__ $$aConference Papers
000170032 520__ $$aImplementing machine learning algorithms for large datasets, such as the Web graph and social networks, is challenging. Even though much research has focused on making sequential algorithms more scalable, their running times continue to be prohibitively long. Meanwhile, parallelization remains a formidable challenge for this class of problems, despite frameworks such as MapReduce, which hide much of the associated complexity. We present three ongoing efforts within our team, previously presented at venues in other fields, which aim to make it easier for machine learning researchers and practitioners alike to quickly implement and experiment with their algorithms in a parallel or distributed setting. Furthermore, in our treatment of these frameworks we highlight some of the features unique to the Scala programming language, in an effort to show how these features can be used to produce efficient and correct parallel systems more easily than ever before.
000170032 6531_ $$aparallel programming
000170032 6531_ $$aprogramming languages
000170032 6531_ $$adomain-specific languages
000170032 6531_ $$atools
000170032 6531_ $$aframeworks
000170032 6531_ $$aproductivity
000170032 6531_ $$aperformance
000170032 700__ $$0242185$$g191683$$aMiller, Heather
000170032 700__ $$0240993$$g172057$$aHaller, Philipp
000170032 700__ $$0241835$$g126003$$aOdersky, Martin
000170032 7112_ $$dDecember 16-17, 2011$$cSierra Nevada, Spain$$aNIPS 2011 Workshop on Parallel and Large-Scale Machine Learning (BigLearn)
000170032 8564_ $$uhttps://infoscience.epfl.ch/record/170032/files/nips2011.pdf$$zn/a$$s95979$$yn/a
000170032 909C0 $$xU10409$$0252187$$pLAMP
000170032 909CO $$qGLOBAL_SET$$pconf$$ooai:infoscience.tind.io:170032$$pIC
000170032 917Z8 $$x191683
000170032 937__ $$aEPFL-CONF-170032
000170032 973__ $$rREVIEWED$$sACCEPTED$$aEPFL
000170032 980__ $$aCONF