1) Forums : Technical Support : SELinux issues with Fedora Core 17 (Message 11970)
Posted 6 Oct 2012 by Profile Martin Ryba
I ended up having to give up C@H on my FC17 box (and won't add it to other machines) since I get several SELinux faults from C@H (invoking execmem, scanning protected folders, etc.), and even when I try to use the policy editor to get past them, I'm still getting them and it's causing jobs to abort. Is anyone else seeing these errors, and are there workarounds? Is there something the code does that was OK with older BOINC installs and/or more permissive OSes but is not good security practice? I don't see this in other (better maintained) projects like Einstein and Milkyway.
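The usual workaround for denials like these is to build a local policy module from the logged AVCs. This is a hedged sketch, not a recommendation specific to C@H: the module name `boinc_local` is my own choice, the commands need root, and you should read the generated `.te` file before installing anything it produces.

```shell
# Collect recent AVC denials and turn them into a local policy module
# (review boinc_local.te before installing it!).
ausearch -m avc -ts recent | audit2allow -M boinc_local

# Install the generated module (requires root).
semodule -i boinc_local.pp

# execmem-type denials can sometimes be covered by an SELinux boolean
# instead; list what your release offers before flipping anything:
getsebool -a | grep -i exec
```

If a boolean covers the denial, `setsebool -P <boolean> on` is cleaner than a custom module, since it survives policy updates.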
2) Forums : General Topics : minimum spec of old pcs? (Message 8784)
Posted 21 Dec 2009 by Profile Martin Ryba
The biggest issue with C@H is memory usage; it tends to grow over the course of the work unit, maxing out at around 500MB (yes, half a gig). If your machine has less than 768MB of RAM (my oldest one that runs C@H successfully), don't try it. On a 512MB machine the application will tend to exit automatically, depending on your settings, since it checks for at least 470MB or so of available memory when it launches. At least it did on my old laptop.
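The launch-time check described above could look something like the following sketch. To be clear, this is my own illustration, not the project's actual code: the 470MB threshold is just what I observed, and reading `/proc/meminfo` is only one (Linux-specific) way to get the figure.

```python
REQUIRED_MB = 470  # approximate free-memory threshold observed at launch


def has_enough_memory(available_mb, required_mb=REQUIRED_MB):
    """Return True if the reported available memory meets the threshold."""
    return available_mb >= required_mb


def available_mb_linux(path="/proc/meminfo"):
    """Read the 'MemAvailable' line (in kB) and convert it to MB."""
    with open(path) as f:
        for line in f:
            if line.startswith("MemAvailable:"):
                return int(line.split()[1]) // 1024
    raise RuntimeError("MemAvailable not found in " + path)
```

A 768MB machine with ~600MB free would pass this check; a 512MB machine with ~400MB free would not, which matches the behavior I saw.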
3) Forums : Cosmology and Astronomy : CMB and Dark Matter (Message 8623)
Posted 8 Oct 2009 by Profile Martin Ryba
I have an admittedly uninformed question about this. We observe that stars at the edge of galaxies are moving "too fast": at those speeds they should escape the galaxy's pull unless some other, unknown force is holding them in. The same is observed for galaxies spinning around each other in clusters. Instead of concluding there must be a "dark matter" creating extra gravity, why don't we conclude that some of our equations for determining these things have errors or are incomplete? Maybe there's a point at which general relativity breaks down. Maybe at a galactic level, the effect on the space-time fabric is different than we expect: an error or missing factor small enough to be imperceptible on small scales such as our solar system, but tremendously visible at the galactic level.
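The "too fast" observation can be made concrete with the standard Newtonian circular-orbit sketch (textbook material, nothing specific to this project). The orbital speed implied by the mass M(r) enclosed within radius r is

```latex
v(r) = \sqrt{\frac{G\,M(r)}{r}}
\quad\Longrightarrow\quad
M(r) \approx \mathrm{const} \;\Rightarrow\; v \propto r^{-1/2},
\qquad
v(r) \approx \mathrm{const} \;\Rightarrow\; M(r) \propto r .
```

Outside most of the visible mass, v should fall off like r^{-1/2}; measured rotation curves instead stay roughly flat, which under the same equation requires unseen mass growing linearly with radius. That's the gap that either dark matter or a modified force law has to fill.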

By coining a name, "dark matter", for the error, we may be misdirecting people's efforts and attention the wrong way. Why not just say, "there's an unknown force or error in our understanding of the universe"? Why make it an object?

While not a cosmologist, I have some experience with gravitational theories. I think the biggest problem with this explanation is that, at least so far, most attempts to introduce a distance-dependent modification of the inverse-square law run into trouble: we have more local but high-accuracy measurements (e.g., binary pulsars, solar-system radar ranging) that would go seriously haywire if you mess around with 1/r^2. Of course, that doesn't preclude some future theorist from hitting upon the right way to do it and making some of these other options look downright silly.
4) Forums : Technical Support : High memory usage (Message 8448)
Posted 22 Jun 2009 by Profile Martin Ryba
The memory issue should be thought about in detail. I was frustrated when CAMB wouldn't even start on my laptop (it has only 512MB physical, and CAMB wants 492MB to start, so that doesn't leave enough for Windows). It was more frustrating when my first WU on my desktop (768MB physical) crunched for 40 hours and then died at about the 80% point. I had the Windows dialog indicating it was increasing the page file size, so memory growth was the likely cause. Depending on how you're using those chunks of memory, you could look at compacting them somehow, or just warn people up front that they need a 2GB machine.