Cloud SIG meeting log
18:59:21 <gholms|work> #startmeeting Cloud SIG (25 Mar 2011)
18:59:21 <zodbot> Meeting started Fri Mar 25 18:59:21 2011 UTC.  The chair is gholms|work. Information about MeetBot at http://wiki.debian.org/MeetBot.
18:59:21 <zodbot> Useful Commands: #action #agreed #halp #info #idea #link #topic.
18:59:37 <gholms|work> #meetingname cloud_sig
18:59:37 <zodbot> The meeting name has been set to 'cloud_sig'
18:59:42 <gholms|work> #chair rbergeron jforbes
18:59:42 <zodbot> Current chairs: gholms|work jforbes rbergeron
18:59:46 * rbergeron will be here shortly
18:59:46 <gholms|work> #topic Roll call
18:59:53 <gholms|work> Who's here today?
18:59:55 * mgoldmann raises his hand
19:00:10 * rbergeron is sort of here at the moment; more of me shortly, whether ya want it or not
19:00:25 * gholms|work looks for brianlamere and jforbes
19:00:27 <obino> o/
19:00:28 * jforbes is here
19:01:21 <mgoldmann> What is a "#halp" command useful for? :)
19:01:22 <gholms|work> Ok, let's get started then.
19:01:32 <ksk2> *** kkeithley is here
19:01:32 <gholms|work> mgoldmann: It's an alias for #help, meaning "call for help"
19:01:40 <gholms|work> #topic EC2
19:01:42 <mgoldmann> gholms|work: thanks!
19:02:20 <gholms|work> jforbes: Anything new this week?
19:02:52 <jforbes> gholms|work: no, I failed to get the email I needed sent, doing so right now so I don't forget
19:03:34 <gholms|work> #help EC2 "Unspin" wiki page needs some love:  http://fedoraproject.org/wiki/EC2_unspin
19:04:06 <gholms|work> Anyone else have anything for this topic, then?
19:04:24 <jforbes> It does need love, though I need to see exactly what kind of information needs to go there.
19:04:24 * dcr226 has, whenever open-floor happens :-)
19:04:47 <gholms|work> #topic Aeolus
19:04:52 * gholms|work has a hard time typing that
19:05:12 <gholms|work> Anyone from that group around today?  I don't see clalance.
19:05:29 <gholms|work> http://aeolusproject.org/page/Packages_Missing_From_Fedora
19:05:39 * clalance here (late)
19:05:55 <gholms|work> Just in time!  :)
19:05:57 <mgoldmann> clalance: right on time :)
19:06:51 <mgoldmann> any new cool stuff in Aeolus?
19:07:26 <clalance> We are in the middle of integrating the new imagefactory into aeolus.
19:07:45 <clalance> mmorsi: Do you have more detailed status about the conductor?
19:07:56 <mmorsi> hrm not too much this week, a lot of bug fixing
19:08:07 <mmorsi> one nice thing we added is a lot more seed data to our installation / configuration script
19:08:17 <mmorsi> more to be added as we go along
19:08:43 * sloranz here (lurking)
19:08:44 <mmorsi> so that launching instances on any cloud provider via aeolus will be as simple as possible (ideally one or two clicks after installation)
19:09:00 <mmorsi> but that's about it on my end, just a lot of bugfixing going on
19:09:04 <mgoldmann> sounds great
19:09:12 <gholms|work> Neato
19:09:46 <gholms|work> Anyone else have more Aeolus-related info?
19:09:56 <gholms|work> (or questions)
19:10:27 <gholms|work> #topic CloudFS
19:10:49 * gholms|work looks around for cloudfs people
19:11:18 * ksk2 is here, but where's jdarcy?
19:11:37 <gholms|work> Beats me; I pinged him.
19:11:42 <gholms|work> Maybe he will show up later.
19:11:58 <clalance> gholms|work: Yeah, he is getting on.
19:12:19 <jdarcy> Hey all.
19:13:00 <gholms|work> What's new?
19:14:07 <jdarcy> Well, I spent most of the week in meetings where GlusterFS/CloudFS were discussed, but I can't say much more about that.
19:14:27 <jdarcy> I was late to the meeting because I was showing Ben (new perf guy) how to set up a GlusterFS filesystem.
19:14:45 <jdarcy> He'll be helping to test performance of many permutations of GlusterFS/CloudFS.
19:15:29 <jdarcy> Not much else to report, unless Kaleb wants to add something.
19:15:42 <ksk2> not much, was in the same meetings with Jeff
19:16:03 <ksk2> I got my dev workstation, and am setting up lots of kvm guests to do dev with
19:16:04 <jdarcy> I think what I can say about those meetings is that awareness of GlusterFS/CloudFS has been significantly heightened within RH, and people are thinking seriously about how to use them.
19:16:52 <jdarcy> Oh, and 3.1.3 came out, so we're in for another round of packaging fun.
19:17:45 <gholms|work> #info GlusterFS 3.1.3 released
19:17:53 * gholms|work hopes he got that right
19:17:58 * jdarcy nods.
19:18:11 <gholms|work> Anything else?
19:18:33 <rbergeron> jdarcy - did you see themayor looking for you yesterday in #fedora-cloud?
19:18:50 <rbergeron> he had some q's about glusterfs - i figured i'd point him your way
19:18:59 <jdarcy> Yep.  Chatting with him in another session right now.
19:19:02 <rbergeron> ah cool
19:19:09 * rbergeron arrives fashionably late, sorry
19:19:24 * gholms|work hands rbergeron a cup of lukewarm coffee
19:19:38 <rbergeron> thanks!
19:19:47 <gholms|work> mgoldmann: Shall we do a BG topic next?
19:19:58 <mgoldmann> gholms|work: sure, a quick update
19:20:04 <gholms|work> #topic BoxGrinder
19:20:25 <mgoldmann> I filed a ticket with rel-eng, but I'm not sure how important it is for F15: https://fedorahosted.org/rel-eng/ticket/4523
19:20:33 <mgoldmann> it's been in the queue for two weeks now
19:21:02 <gholms|work> Maybe try pinging a rel-eng person on IRC?
19:21:13 <rbergeron> maybe dgilmore
19:21:18 <mgoldmann> ok, will do
19:21:27 <mgoldmann> other than this - after https://issues.jboss.org/browse/BGBUILD-156 gets fixed (Monday) I plan to release 0.9.1 on Tuesday or so
19:21:45 <mgoldmann> mostly a bugfix release: https://issues.jboss.org/browse/BGBUILD/fixforversion/12316138
19:21:53 <gholms|work> #info BoxGrinder 0.9.1 bugfix release coming soon
19:22:22 <mgoldmann> I was pulled off to do some different stuff, but I hope to say more soon
19:22:37 <mgoldmann> (not very cloud related, but fedora definitely :))
19:22:44 <rbergeron> ohh interesting
19:23:04 <rbergeron> we look forward to hearing :)
19:23:17 <mgoldmann> sure :)
19:23:24 <mgoldmann> that's all from my side
19:23:41 <gholms|work> #topic Open floor
19:23:55 <gholms|work> Any cloudy questions?  Comments?  Tomatoes?
19:24:10 * dcr226 does
19:24:30 <gholms|work> Go for it
19:24:33 <mgoldmann> added this: https://fedoraproject.org/wiki/Cloud_SIG/Meetings
19:24:34 <dcr226> I'm sure you guys have discussed it, but you are very close to a "free fedora" with EC2's free tiny instances
19:24:50 <dcr226> all that is needed is an EBS backed instance
19:25:35 <dcr226> I wondered if there was any intention at all for creating a suitable image, and indeed if any help was needed in doing so
19:26:04 <gholms|work> jforbes: What's the likelihood of our seeing EBS images?
19:26:31 <gholms|work> jforbes: It sounds like dcr226 might be willing to help.  :)
19:26:57 <jforbes> gholms|work: likelihood is good since we have the script for conversion
19:27:47 <gholms|work> dcr226: Does that help at all?
19:27:49 <dcr226> that's cool, I personally use a tiny instance, sadly it has to be C5.5
19:28:06 <dcr226> and I was going to build an F14 EBS image to use, but pointless if you guys are already on it :-)
19:28:08 <jforbes> dcr226: F15 should have EBS images at release.  We have the script
19:28:25 <dcr226> jforbes, cool. I'm happy to test that script if you want
19:28:32 <gholms|work> #info Fedora 15 should have EBS images on EC2
19:29:02 <jforbes> eh, that script has to tie in account info and a bunch of other bits, it isn't really a distributable item
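(For context: the generic instance-store-to-EBS flow that a script like this automates is to create a volume, copy the unpacked root image onto it, snapshot it, and register an EBS-backed AMI that points at the snapshot. A minimal sketch using boto follows; it is not the rel-eng script jforbes mentions, and the region, availability zone, kernel ID, and device names are placeholders.)

    # Rough sketch of the generic instance-store -> EBS-backed AMI flow,
    # NOT the rel-eng conversion script discussed above.  Region, zone,
    # kernel ID, and device names are placeholder values.
    import boto.ec2
    from boto.ec2.blockdevicemapping import BlockDeviceMapping, BlockDeviceType

    conn = boto.ec2.connect_to_region('us-east-1')  # credentials come from the environment

    # 1. Create a volume large enough for the root filesystem; the actual
    #    copy (dd of the unpacked image onto the attached volume) happens
    #    on a helper instance and is not shown here.
    vol = conn.create_volume(10, 'us-east-1a')

    # 2. Snapshot the populated volume (in practice, wait for it to complete).
    snap = conn.create_snapshot(vol.id, 'Fedora root filesystem')

    # 3. Register an EBS-backed AMI whose root device points at the snapshot.
    bdm = BlockDeviceMapping()
    bdm['/dev/sda1'] = BlockDeviceType(snapshot_id=snap.id, delete_on_termination=True)
    ami_id = conn.register_image(name='fedora-ebs-test', architecture='x86_64',
                                 kernel_id='aki-00000000',  # pvgrub AKI placeholder
                                 root_device_name='/dev/sda1', block_device_map=bdm)
    print(ami_id)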
19:29:12 * dcr226 thinks "Free Fedora" would be cool
19:29:25 <dcr226> jforbes, ok - well, just ping if you want any help/EC2 testing
19:29:28 <jdarcy> "Freedora"
19:29:32 <gholms|work> jforbes: Did you learn anything account-related from spevack?
19:29:35 <jforbes> dcr226: we will
19:29:45 <jforbes> gholms|work: that was the email I hadn't sent out and just did
19:29:49 <dcr226> jdarcy, ship it! :-)
19:29:50 <gholms|work> Awesome
19:30:15 <gholms|work> #help Update wiki plzkthx:  http://fedoraproject.org/wiki/Cloud_SIG
19:30:37 <mgoldmann> gholms|work: I started this just before the meeting
19:31:04 <gholms|work> A whole lot of the wiki is marked "Space for rent" or full of cobwebs.  Filling it out or updating it would be most appreciated.
19:31:13 <gholms|work> Thanks, mgoldmann
19:31:17 <mgoldmann> I created https://fedoraproject.org/wiki/Cloud_SIG/Meetings
19:31:39 <kisielk> is there any chance of seeing another AMI for F14, with bigger root? Or at least some instructions on how to build one? :)
19:32:37 <kisielk> that's my #1 problem right now, I'm willing to do the legwork if I can get some pointers on how to do it
19:34:17 <gholms|work> [A dog barks in the distance]
19:34:35 <gholms|work> obino: Ever resized an emi's root filesystem before?
19:34:56 <obino> on a running instance?
19:34:56 <dcr226> wasn't there a working ami-build script floating around the ml?
19:35:18 <gholms|work> obino: Presumably by re-bundling.  I don't think you can resize a running instance's.
19:35:31 <obino> exactly :)
19:35:37 <obino> I do resize images offline
19:35:43 <obino> that's easy
19:37:33 <gholms|work> obino: Any pointers/howtos/etc?
19:38:23 <obino> http://open.eucalyptus.com/wiki/resizing-images
19:38:34 <obino> http://open.eucalyptus.com/participate/wiki/changing-size-images
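(For context: the offline resize those pages describe amounts to growing the raw image file and then the filesystem inside it. A minimal sketch, assuming an unpartitioned ext3/ext4 root image; the image path and target size below are placeholders.)

    # Minimal sketch of growing a raw (unpartitioned) ext3/ext4 root image
    # offline, along the lines of the Eucalyptus wiki pages linked above.
    # Paths and sizes are placeholders; run only on an image that is not in use.
    import subprocess

    IMAGE = 'fedora-root.img'   # raw filesystem image, not a partitioned disk
    NEW_SIZE = '10G'            # target size (placeholder)

    subprocess.check_call(['truncate', '-s', NEW_SIZE, IMAGE])  # grow the sparse file
    subprocess.check_call(['e2fsck', '-f', IMAGE])              # check before resizing
    subprocess.check_call(['resize2fs', IMAGE, NEW_SIZE])       # grow the filesystem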
19:39:24 <gholms|work> Just keep in mind that EC2 limits root filesystems' sizes to about 10G.
19:40:07 <clalance> There was a rumor that they lifted that restriction.
19:40:10 <clalance> I have no idea if it is true.
19:40:58 <gholms|work> Oh, that reminds me!  I pushed a new euca2ools package to epel-testing for el5 that fixes the last known bug.  So if you have been wanting to use it, that's where you can find it.
19:41:17 <gholms|work> #info euca2ools package for el5 pushed to epel-testing; no known bugs
19:41:24 <kisielk> 10G is good enough for me, it's just that 2G is completely inadequate :)
19:41:36 <gholms|work> Installing a lot of packages, eh?  ;)
19:41:53 <kisielk> yes, scientific software is large
19:42:02 * gholms|work hopes it isn't root
19:42:07 <gholms|work> Root is huuuuge
19:42:20 <kisielk> homegrown stuff :)
19:43:00 <gholms|work> Anything else for open floor?
19:43:06 * gholms|work looks at rbergeron
19:44:42 <gholms|work> All right.  Thanks for coming, everyone!
19:44:46 <gholms|work> #endmeeting