cloud_sig
LOGS
19:01:37 <rbergeron> #startmeeting Cloud SIG
19:01:37 <zodbot> Meeting started Fri Jun  1 19:01:37 2012 UTC.  The chair is rbergeron. Information about MeetBot at http://wiki.debian.org/MeetBot.
19:01:37 <zodbot> Useful Commands: #action #agreed #halp #info #idea #link #topic.
19:01:41 <rbergeron> #meetingname Cloud SIG
19:01:41 <zodbot> The meeting name has been set to 'cloud_sig'
19:02:24 <rbergeron> #chair maxamillion
19:02:24 <zodbot> Current chairs: maxamillion rbergeron
19:02:29 <rbergeron> #chair gholms
19:02:29 <zodbot> Current chairs: gholms maxamillion rbergeron
19:03:41 <gholms> rbergeron: Quit slacking!
19:03:44 <gholms> #topic Roll call
19:03:49 <gholms> Who's here today?
19:04:04 * maxamillion is here
19:04:13 <maxamillion> I'm a chair? ... that's scary
19:05:00 <rbergeron> i'm here.
19:05:09 <rbergeron> gholms: :)
19:05:13 <gholms> :D
19:05:25 <rbergeron> feels like the week after release ;)
19:05:55 * mdomsch 
19:06:03 <gholms> Exhausted?
19:06:55 <rbergeron> yes.
19:06:57 <rbergeron> still.
19:07:00 <rbergeron> mdomsch: hi there :)
19:07:02 <rbergeron> jdarcy :)
19:07:37 * jdarcy :)
19:08:04 <rbergeron> okay, we'll start!
19:08:09 <rbergeron> #topic Plans for F18
19:08:27 <mdomsch> rbergeron: I thought you found a minion to handle feature process now, no?
19:08:28 <rbergeron> So i plugged this a bit in the agenda - I mostly want to see if anyone's got anything interesting they're either doing, planning, or would love to see this release
19:08:34 <rbergeron> mdomsch: HA HA HA HAHAAHAHAHAHAH
19:08:36 <rbergeron> WHEW
19:08:37 <rbergeron> good one
19:08:48 <rbergeron> mdomsch: there's a req.
19:08:49 <mdomsch> :-)
19:09:09 <rbergeron> mdomsch: these things move at, you know, the speed of wood petrification at red hat :)
19:09:19 * mdomsch wants to see CloudStack finished getting into the distro
19:09:38 <gholms> SeTo see for F18?
19:09:41 <mdomsch> and wants us to think hard about creating separate EPEL-like repos for each cloud management system
19:09:47 <rbergeron> mdomsch: indeed.
19:10:47 <mdomsch> I tried to get FESCO town hall candidates to bite at that yesterday, with no luck
19:11:24 <rbergeron> mdomsch: yeah, i think fesco as a whole is still sort of like... cloud, that thing
19:11:37 <rbergeron> i don't think anyone has any concretely formed opinions
19:11:46 <rbergeron> gholms: seto see?
19:11:51 <gholms> What?
19:11:56 <gholms> Oh
19:12:04 <rbergeron> mdomsch: you also didn't get a lot of bites on the mailing list either, iirc with that discussion
19:12:16 <mdomsch> yeah, not so much
19:12:16 <rbergeron> #idea cloudstack finished for F18
19:12:30 <gholms> rbergeron: That was me typing, hitting backspace a bunch of times, typing a new sentence, and having the wireless drop.
19:12:31 <rbergeron> #idea think about creating separate epel-like repos for each cloud management system
19:12:38 <rbergeron> gholms: epic :)
19:12:55 <rbergeron> #idea Euca for F18 too :)
19:13:02 <gholms> :)
19:13:09 <rbergeron> gholms: any idears?
19:13:29 * gholms hrms
19:13:32 <maxamillion> mdomsch: I might have missed something, but what's the motivation for the different repos?
19:14:01 <gholms> maxamillion: EPEL requires more long-term stability/support than any extant cloud platform is able to provide.
19:14:18 <mdomsch> maxamillion: original request was for a user running OpenStack Diablo, got forcibly upgraded to Essex when the bits landed in EPEL
19:14:32 <gholms> mdomsch: You mean something more formal than repos.fp.o?
19:14:42 <mdomsch> gholms: ideally, yes
19:14:45 <maxamillion> mdomsch: ahhhh, ok
19:15:33 * rbergeron hands gholms a dime for his use of the word extant
19:15:36 <rbergeron> ;)
19:15:41 <gholms> Heh
19:15:46 <maxamillion> mdomsch: maybe branch openstack-swift-diablo and openstack-swift-essex separately? (as well as for all the other stuff)
19:16:00 <gholms> That was certainly proposed.
19:16:09 <gholms> But it means a new review of every package for every series.
19:16:13 <mdomsch> I don't want to solve it here
19:16:19 <maxamillion> gholms: oh right
19:16:28 <maxamillion> mdomsch: right, sorry ... just a random thought :)
19:16:36 <mdomsch> I want us to think of the experience we want our end users to have
19:16:42 <mdomsch> and then reverse engineer from there
19:17:00 <maxamillion> rgr
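(Editor's aside: the per-cloud-platform repo idea above comes down to letting users pin to a release series - e.g. staying on Diablo instead of being force-upgraded when Essex lands in EPEL. A minimal sketch of what per-series repo definitions could look like; every repo name and baseurl here is hypothetical, not an existing repository:)

```ini
# Hypothetical per-series repos -- names and URLs invented for
# illustration only.  A user who wants to stay on Diablo enables
# the diablo repo and leaves essex disabled; new series land in
# their own repo instead of replacing packages in place.
[openstack-diablo]
name=OpenStack Diablo for EL6 (hypothetical)
baseurl=https://repos.example.org/openstack/diablo/el6/$basearch/
enabled=1
gpgcheck=1

[openstack-essex]
name=OpenStack Essex for EL6 (hypothetical)
baseurl=https://repos.example.org/openstack/essex/el6/$basearch/
enabled=0
gpgcheck=1
```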
19:17:36 <mdomsch> I'd like to see the SIG make dgilmore less a SPOF
19:17:43 <rbergeron> mdomsch: are there any specific fesco people that might be interested in helping with such a thing? or do we have the brainpower here to do it on our own (i suspect we do)
19:17:44 <gholms> Yes, please
19:18:18 <rbergeron> re: the packaging stuf
19:18:19 <rbergeron> stuff
19:18:26 <gholms> #idea Recruit volunteers to build Fedora cloud images
19:18:34 <mdomsch> rbergeron: unclear - likely.  Just like when EPEL launched I hope
19:19:56 * rbergeron nods
19:20:20 <rbergeron> #idea define/envision end-user experience(s)
19:20:39 <mdomsch> Cloud SIG FAD? :-)
19:20:50 <gholms> Yes, please!
19:21:00 <gholms> I kind of want to do one of those late this summer.
19:21:01 <rbergeron> #idea Cloud SIG FAD (with lots of yes, plz)
19:21:14 <rbergeron> gholms: by late summer please say early fall? lol
19:21:16 <mdomsch> I know, let's do it in San Diego the last week of August
19:21:23 <rbergeron> mdomsch: LOL
19:21:39 <rbergeron> mdomsch: i think we could probably do something, but i don't know if we could actually stay focused there :)
19:21:48 <mdomsch> j/k
19:21:56 <gholms> tdawson mentioned that a bunch of the openshift people would probably be interested in something around August.
19:22:02 <rbergeron> gholms: ahhh
19:22:04 <rbergeron> hrmmm
19:22:21 <rbergeron> as in, "once they have it packaged"?
19:22:35 <rbergeron> or what's the august... dependency, i guess
19:22:39 <gholms> Not sure, he just said "in a couple months."
19:22:46 <gholms> s/,/;/
19:22:48 <maxamillion> August sounds good for me, I have a couple weekends taken up in August, but other than that
19:23:02 <maxamillion> (from an openshift standpoint) :P
19:23:14 <gholms> I should probably check with $dayjob.  :)
19:23:15 <rbergeron> gholms: have ou poked anyone else about it?
19:23:32 <rbergeron> maybe a mail on the mailing list would be useful - esp. if we have some specific problems defined, etc.
19:23:36 <rbergeron> or tasks defined :)
19:23:40 <gholms> rbergeron: It was part of a blog post that landed on planet.fp.o.  Does that count?
19:23:54 <zoglesby> We did a Docs FAD at OLF in September last year, that worked really well for us (sorry to interrupt)
19:24:27 <rbergeron> god, summer, my kids are asking inane questions, sorry
19:24:45 <gholms> There's a fairly nice co-working space in Durham that should be within driving distance of people in the RDU area.
19:25:04 <rbergeron> gholms: mailing list might be useful nonetheless :)
19:25:06 <gholms> And then agrimm and gregdek would have no choice but to attend. :)
19:25:08 <gholms> Indeed.
19:25:10 <rbergeron> gholms: LOL
19:25:51 <rbergeron> and I think we have lots of people within driving range anyway - there are some cloudstack peeps in the "area" (10m to 4h drive), openshift folks i suspect, etc.
19:26:46 <rbergeron> gholms: if you want to do that, i can point you at the fad planning pages - i think we really just need to brainstorm on it a bit between "what's everyone's availability" and "what do we want to solve"
19:27:08 <gholms> https://s3.amazonaws.com/devrandom/imgs/wfm.png
19:27:33 <maxamillion> gholms: awesome
19:27:45 <rbergeron> awesome
19:28:04 <rbergeron> #action gholms to stir up fad planning action on the cloud sig mailing list
19:28:19 <rbergeron> perhaps a wiki page of just ideas and the who and the whens possibilities would be a good spot to get started
19:28:23 <rbergeron> err, good place
19:28:23 <rbergeron> damnit
19:28:29 <gholms> rbergeron: Don't summon him!
19:28:31 * rbergeron hates that she hails tom unnecessarily all the time :)
19:28:38 * spot emerges from the darkness
19:28:43 <gholms> Oh noes!
19:28:44 <rbergeron> i think he's going to make me start providing a beer every time i do that
19:28:50 <spot> BEEEEEEER
19:29:06 <rbergeron> which means a lot of beer for him next weekend
19:29:10 <rbergeron> anyway: OTHER IDEARS?
19:29:39 <gholms> Well now we have to start a beer fund for rbergeron's mishaps.
19:29:42 <rbergeron> I know the "wouldn't it be nice if i could build a shift on top of a stack on top of fedora" is looming there.
19:29:45 <gholms> Or something.
19:29:50 <dgilmore> help in testing out live-media-creator to make images would rock
19:30:24 <rbergeron> #idea help in testing out live-media-creator to make images would rock
19:30:26 <gholms> rbergeron: I think "making $paas work on $iaas" would be a great thing to do at a FAD and also in general.
19:30:36 <gholms> Hmm...
19:30:44 * rbergeron hands gholms the idea button
19:30:44 <mdomsch> how is that openSUSE Studio-like project coming along?  "here's a kickstart file, give me a cloud instance please"
19:30:59 <mdomsch> gholms: >1 day :-)
19:31:00 <rbergeron> mdomsch: oh, you know, in my copious spare time i've gone nowhere with it
19:31:20 <mdomsch> rbergeron: there was a planet post just last week about it - some intern...
19:31:21 <gholms> #idea Make $PaaS work on $IaaS
19:31:26 <rbergeron> mdomsch: unless you're referring to just boxgrinder alone
19:31:37 <gregdek> +1 to gholms' idea.
19:31:42 <rbergeron> mdomsch: ah, yes, i think i saw that at some point
19:31:58 <rbergeron> though i can't remember if it's actually intern-y or more like... GSOC
19:32:02 <gregdek> agrimm favors boxgrinder, fwiw.
19:32:14 <rbergeron> gregdek: i think a couple projects favor it
19:32:25 <gregdek> (actually he favors ami-creator.)
19:32:35 <rbergeron> or at least recommend it here and there
19:32:54 <rbergeron> mdomsch: would you be interested in reaching out to $person and inviting them to the cloud sig mailing list? :)
19:33:26 <mdomsch> rbergeron: I'm looking for the post
19:34:08 <rbergeron> other ideas! or I'm movin' on to poking at people :)
19:34:19 <rbergeron> people/projects
19:34:55 * ke4qqq shows up very late
19:35:49 <rbergeron> ke4qqq: quick! ideas for f18
19:35:50 <rbergeron> go :)
19:35:57 <rbergeron> mdomsch already covered packaging CS
19:35:59 <rbergeron> :)
19:36:25 <rbergeron> ....or not. okay... moving on
19:36:30 <rbergeron> unless someone objects :)
19:36:36 <rbergeron> #topic Gluster
19:36:44 <rbergeron> jdarcy: HEY! i hear y'all just had a release
19:37:52 * rbergeron taps her microphone
19:38:07 <rbergeron> #link http://www.gluster.org/2012/05/introducing-glusterfs-3-3/
19:38:17 <rbergeron> okay, i'll just plug that for you guys.
19:38:19 <rbergeron> and move onwards
19:38:21 <jdarcy> rbergeron: Yes, we did.  :)
19:38:24 <rbergeron> oh.
19:38:28 <jdarcy> Now we get to do all the interesting stuff.
19:38:28 <rbergeron> Anything to add to that?
19:38:33 <rbergeron> do tell.
19:38:47 <jdarcy> Well, there's just a ton of stuff that got blocked behind 3.3.
19:39:03 <jdarcy> Biggest item for me is multi-way active/active replication (the "and a pony" kind).
19:39:23 <jdarcy> That should be good for cloudy folks because migration in/out of clouds is getting to be a hot issue.
19:40:14 <jdarcy> Kaleb is tearing his hair out (or would be) trying to deal with GlusterFS 3.2 + HekaFS vs. GlusterFS 3.3 vs. RHS vs. EPEL etc.
19:40:26 <rbergeron> jdarcy: aye :)
19:40:44 <jdarcy> Thank God we have him.
19:41:43 <rbergeron> indeedy
19:41:53 <rbergeron> okay. anything else?
19:42:01 * rbergeron will move on if not :)
19:42:10 <jdarcy> Nothing for me.
19:42:12 <rbergeron> #topic openstack
19:42:18 <rbergeron> rustlebee: yo, if you're still about
19:42:31 * rbergeron is pretty sure there's no huge news here atm
19:43:05 <rbergeron> okay! we'll assume that's the case
19:43:07 <rbergeron> #topic openshift
19:43:14 <rbergeron> maxamillion: tell me you're here :)
19:43:58 <rbergeron> ke4qqq: if you're here maybe you could pipe in about the kind brave soul who is working on chef packaging also :) if there's anything to say there about it
19:44:29 * rbergeron pouts
19:44:35 <ke4qqq> some intrepid victim has undertaken chef - I've volunteered to sponsor him - and pushed him to doing informal reviews on openshift
19:45:01 <rustlebee> that was great to see
19:45:22 <gregdek> chef? oo!
19:45:43 <ke4qqq> rustlebee: the new nick is even more disturbing than the drumkilla > russellb
19:45:43 <maxamillion> rbergeron: sorry, have like 3 meetings going on right now
19:45:47 <rbergeron> #info a kind-hearted soul has undertaken chef packaging, ke4qqq has volunteered to sponsor him
19:45:54 <rbergeron> maxamillion: ah, gotcha
19:46:10 <rbergeron> rustlebee: i stand yet again in my "i totally like it" corner :)
19:46:17 <rustlebee> :)
19:46:35 <rbergeron> #info (and robyn is bad at meetings, chef packaging has nothing to do with openshift, just randomly wandering through topics)
19:46:48 <rbergeron> gregdek: yes, ooooo!
19:46:51 * jdarcy <- even worse at meetings
19:47:54 <maxamillion> rbergeron: so, openshift has some fun challenges going on right now because we are currently based on ruby 1.8 and we're working to get things in line with ruby 1.9 for the F18 openshift feature
19:48:21 <rbergeron> ahh, yes. ;)
19:49:41 <rbergeron> maxamillion: well, people are looking forward to it :)
19:49:53 <maxamillion> rbergeron: we're also in the process of getting all our m-collective scaling code open sourced and up on github and out to the community in a consumable form ... details on the state of that are on our FAQ https://openshift.redhat.com/community/wiki/faq-frequently-asked-questions under "how do I scale to more than one node with OpenShift Origin?"
19:50:04 <maxamillion> rbergeron: we're looking forward to it! :D
19:50:08 <maxamillion> oh!
19:50:24 <rbergeron> r-e-o
19:50:26 <rbergeron> mmmm
19:50:27 <rbergeron> oreos
19:50:34 * rbergeron waits for maxamillion's oh!
19:50:43 <maxamillion> we'll be updating our OpenShift Origin LiveCD Fedora Remix (LONGEST NAME EVAR)
19:50:48 <maxamillion> ohhhhhh!!!!
19:50:54 <maxamillion> o.O;
19:51:10 <mdomsch> OpenShift Origin CloudStack oVirt LiveCD Fedora Remix
19:51:13 <rbergeron> updating it to... be based on a beefy miracle?
19:51:22 <rbergeron> mdomsch: you forgot euca
19:51:40 <rbergeron> lol
19:51:51 <maxamillion> oh! also, I've launched a Fedora Nightly (only Fedora 16) right now for the openshift origin bits that are hosted on the github openshift/crankcase repo --> http://mirror.openshift.com/pub/crankcase/nightly/fedora-16/
19:52:07 <rbergeron> maxamillion: wow, you've been busy :)
19:52:08 <maxamillion> note however that there are no builds for last night ... that was my mistake :(
19:52:13 <maxamillion> rbergeron: certainly so :)
19:52:26 <rbergeron> that is awesome though
19:53:44 <rbergeron> okay. i'm moving on!
19:53:48 <rbergeron> #topic Open Floor
19:53:55 <rbergeron> now's the time to bring up what i forgot!
19:54:07 * mdomsch got s3-mirror-us-west-1 functional on Wednesday
19:54:16 <mdomsch> now both us-east-1 and us-west-1 have private S3 mirrors
19:54:27 <mdomsch> and I've started building one for us-west-2
19:54:55 * mdomsch _needs_ someone who knows about S3 log files, processing, and reporting, to do something useful with the bucket logs
19:54:59 <rbergeron> mdomsch: i got the account moved back out of being consolidated
19:55:07 <mdomsch> rbergeron: great, thanks!
19:55:18 <dgilmore> mdomsch: good uptake of f17?
19:55:22 <rbergeron> mdomsch: i have no idea if it's back to *free* but...
19:55:29 <mdomsch> dgilmore: no idea - someone needs to look at the stats
19:55:52 <mdomsch> fwiw, stats are downloaded daily to log02, where they could stand to be processed
19:56:35 <mdomsch> s/stats/logs/
19:56:58 <rbergeron> #idea need someone who knows about s3 log files, processing, and reporting, to do something useful with the bucket logs
19:57:08 <mdomsch> this is our best method of telling how many people actually use our images in EC2
19:57:38 <rbergeron> #link http://ihasabucket.com/
19:57:59 * ke4qqq has done some s3 log processing in the past - if no one gets to it before me, and I ever get some time freed up I might take a look at it.
19:58:22 <mdomsch> there are some for-pay services like http://www.s3stat.com we could use too
19:58:26 <rbergeron> ke4qqq: at least having some documentation might be helpful or a "here's what to do" type of thing
19:58:50 <rbergeron> ie: i wouldn't even know where to get started (surprising, i know)
19:59:27 <mdomsch> at $5/month, that might be the easiest way to make use of the logs we have
19:59:36 <ke4qqq> indeed
19:59:59 <rbergeron> mdomsch: hmm, interesting
20:00:24 <rbergeron> mdomsch: can you elaborate on what data we'd want to glean out of that - other than "people are using this stuff"
20:00:34 <mdomsch> rbergeron: that's most of it honestly.
20:00:48 <mdomsch> some idea of number of VMs
20:00:48 <rbergeron> mdomsch: okay
20:00:53 <mdomsch> by version and region
20:00:54 <rbergeron> fair 'nuff :)
20:01:01 <rbergeron> yeah, i think the version, region info would be helpful
20:01:04 <mdomsch> right now it's "we hope we have some users"
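(Editor's aside: the bucket-log analysis being asked for could start very small. The sketch below tallies successful GETs per Fedora release from S3 server access log records, using the fact that the request line is the first double-quoted field in each record. The `/releases/N/` path layout is an assumption about how the mirror tree is laid out inside the bucket, not a verified fact:)

```python
import re
from collections import Counter

# The request line ("GET /bucket/key HTTP/1.1") is the first
# double-quoted field in an S3 server access log record, followed
# by the HTTP status code.
REQUEST_RE = re.compile(r'"(?P<method>\S+) (?P<path>\S+) [^"]*" (?P<status>\d{3})')

# Hypothetical path layout; adjust to match the real mirror tree
# inside the bucket.
RELEASE_RE = re.compile(r'/releases/(\d+)/')

def tally_downloads(log_lines):
    """Count successful GETs per Fedora release found in the object path."""
    counts = Counter()
    for line in log_lines:
        m = REQUEST_RE.search(line)
        if not m or m.group("method") != "GET":
            continue
        if not m.group("status").startswith("2"):
            continue  # skip errors, redirects, and partial responses
        r = RELEASE_RE.search(m.group("path"))
        if r:
            counts[r.group(1)] += 1
    return counts
```

Splitting out region would work the same way: run one tally per per-region bucket (us-east-1, us-west-1, ...) since each bucket logs separately.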
20:01:10 * gholms reappears
20:01:28 <rbergeron> i am apparently late for a meeting.
20:01:42 * rbergeron is happy to pass the baton off to someone else
20:02:14 <rbergeron> mdomsch: i agree
20:02:36 <rbergeron> oay. anyone else?
20:02:41 * rbergeron slaps her k key
20:03:20 <gholms> [Meanwhile, back at the ranch...]
20:03:24 <maxamillion> gholms: lol
20:03:52 <rbergeron> thanks for coming, all :)
20:04:00 * rbergeron looks forward to fad discussions :)
20:04:02 <rbergeron> #endmeeting