Apple

Synchronized

When you work across multiple devices and computers on a daily basis, keeping the information you expect to be there consistent across all of them used to be a monstrous pain. This is where synchronization comes in.

I have three “computers” I use every day: my iMac, my MacBook Pro, and my iPhone. On each of those computers, I have several programs that may need to access the same type of data.

Bookmarks are synchronized using Xmarks. This allows me to sync them across Safari, Google Chrome, and Firefox. And because the bookmarks are sync’d to Safari via a background process, I can use MobileMe to sync them to my iPhone. All this happens in the background, without me having to think about it. I just add a bookmark somewhere, and minutes later it’s reflected everywhere else.

Email rules, accounts, and signatures are synchronized via MobileMe and appear on all my computers and my iPhone. Contacts are sync’d via MobileMe and appear everywhere. Same with calendars, except calendars are the real win: I can make a calendar entry on my iPhone, and it’s instantly sync’d to the calendars on my laptop and desktop.

I also have some files and programs that I need access to everywhere, so I sync those with MobileMe across all my devices via iDisk. I can get to them anywhere, even on my iPhone. I even created a directory in there called “Scripts”; with a change to my bash path on my Macs, any scripts I write are sync’d too.

And all this stuff happens more or less instantly and completely transparently to me - via the Internet, and over the air for the iPhone. I don’t even have to plug anything in. It just happens. I can’t believe computers ever worked any other way, and there is no way I could do without it now.

Xmarks is free. MobileMe is $99 a year, but it’s totally worth it simply for the headaches I save by not having to deal with disparate data spread across three devices.

Apache

MySQL-based Apache HTTP Authentication for Trac and Subversion

In working on a side project with a few friendly developers, we decided to set up a Subversion repository and a Trac bug and issue tracker. Both of these, in normal setups, rely on HTTP authentication. So, since we already had an authentication database as part of the project, my natural first thought was to find a way to authenticate both Trac and Subversion against that existing MySQL database rather than rely on Apache passwd files that would have to be updated separately.

Surprisingly, this was more difficult than it sounded.

My first thought was to try mod_auth_mysql. However, from the front page, it looks as if this project has not been updated since 2005 and is likely not being actively maintained. Nonetheless, I gave it a shot and, surprisingly, got it mostly working against Apache 2.2.14.

Notice I said “mostly.” It would authenticate about 50% of the time, while filling the Apache error logs with fun things like:

[Sat Feb 13 11:11:27 2010] [error] [client -.-.-.-] MySQL ERROR: Lost connection to MySQL server at 'reading initial communication packet', system error: 0
[Sat Feb 13 11:11:28 2010] [notice] child pid 19074 exit signal Segmentation fault (11)
[Sat Feb 13 11:34:14 2010] [error] [client -.-.-.-] MySQL ERROR: Lost connection to MySQL server during query:
[Sat Feb 13 11:34:15 2010] [error] [client -.-.-.-] MySQL ERROR: MySQL server has gone away:

Rather than tear into this and try to figure out why a 5-year-old auth module isn’t working against far newer code, and with very little to actually go on, I just concluded that it wasn’t compatible and looked for a different solution.

That’s when I came across mod_authnz_external. If you’re not familiar with this module, it lets you authenticate against an external program or script running on your system, which means you can auth against pretty much anything you want - a script talking to a database, PAM system logins, LDAP, anything you have access to. All you have to do is write the glue code.

In pipe mode, mod_authnz_external uses the pwauth format: it passes the username and password to the external program on stdin, one per line. The program then uses its exit code to tell Apache whether or not the login was valid. Knowing that, it’s pretty easy to write a little script to read the username and password, run a query, and return the result.

#!/usr/bin/php
<?php

include "secure_prepend.php";
include "database.php";

// mod_authnz_external (in pipe mode) hands us the username on the first
// line of stdin and the password on the second.
$fp = fopen("php://stdin", "r");
$username = stream_get_line($fp, 1024, "\n");
$password = stream_get_line($fp, 1024, "\n");

// $db comes from database.php; look the user up with the supplied credentials.
$sql = "select user_id from users where username='%s' and password='%s' and disabled=0";
$sql = sprintf($sql, $db->escape_string($username), $db->escape_string($password));

// Exit 0 to tell Apache the login is valid, non-zero to reject it.
$user = $db->get_row($sql);
if(!empty($user)) {
    exit(0);
}
exit(1);

?>

Then, you just hook this into your Apache config for Trac or Subversion:

AddExternalAuth auth /path/to/authenticator/script
SetExternalAuthMethod auth pipe

<Location />
    DAV svn
    SVNPath /path/to/svn
    AuthName "SVN"
    AuthType Basic
    AuthBasicProvider external
    AuthExternal auth
    require valid-user
</Location>
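
That covers Subversion. On the Trac side, the usual approach is to protect only Trac’s login URL with the same external provider. The path below assumes Trac is served from /trac (possibly on a separate vhost), so adjust it to your layout; this is just a sketch of how the same AuthExternal hook would be reused:

<Location /trac/login>
    AuthName "Trac"
    AuthType Basic
    AuthBasicProvider external
    AuthExternal auth
    require valid-user
</Location>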

Restart Apache, and it should all be working.

Some may argue that the true “right” way to do this is LDAP. But with just three of us, LDAP is overkill, especially when we already have the rest of the database stuff in place. The big advantage to this, even over mod_auth_mysql, is the amount of processing you can do on login. You can basically run any number of queries in your authenticator script - rather than just one. You can update a last-login or last-commit date, for instance. Or you can join tables for group checking; say you want someone to have access to Trac, but not Subversion. You can do that with this approach - a rough sketch follows.
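
Just to illustrate, here is a minimal sketch of what that could look like for the Subversion side. The user_groups table, the “svn” group name, and the last-login comment are all hypothetical - they assume the same $db wrapper from database.php as the script above, and a schema you would adapt to your own project.

#!/usr/bin/php
<?php

include "secure_prepend.php";
include "database.php";

// Same pipe protocol as before: username on the first line of stdin,
// password on the second.
$fp = fopen("php://stdin", "r");
$username = stream_get_line($fp, 1024, "\n");
$password = stream_get_line($fp, 1024, "\n");

// Hypothetical schema: join against a user_groups table so this copy of
// the authenticator only accepts members of the "svn" group.
$sql = "select u.user_id
          from users u
          join user_groups g on g.user_id = u.user_id
         where u.username='%s' and u.password='%s'
           and u.disabled=0 and g.group_name='svn'";
$sql = sprintf($sql, $db->escape_string($username), $db->escape_string($password));

$user = $db->get_row($sql);
if(!empty($user)) {
    // A last-login update or any other bookkeeping could run here as well.
    exit(0);
}
exit(1);

?>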

Conferences

OSCON 2009 Summary

I have to say that everything that didn’t involve air travel (I’ll go into ALL of that later) was awesome on this trip. I learned some useful things at OSCON, enjoyed good company, and had a good time exploring San Jose and the Bay Area in general.

OSCON was good this year, but not as good as in years past. This may be due to the new location, which doesn’t seem as well suited to a conference like this as the Oregon Convention Center was. The OCC was round, and all the meeting rooms were clustered in a central area - there was never more than a short walk between panels. But the San Jose Convention Center is more of a traditional box design, with a single LONG hallway. This means that if you’re in J3 and have to go to B2, good luck, because it’s a 15-minute walk. For a conference like OSCON, this kind of sucks and absolutely kills the “community” feel of it.

Also, like many things, it suffers from diminishing returns. Because a lot of this is stuff I’ve seen before, every year that I come, I have to work harder and harder to find something new. Three years ago, I was doing well to decide what not to learn about. So this may be my last OSCON for a few years, though I’m thinking of attending Velocity (held down the road at the Fairmont) next year.

I did attend some interesting side panels, including one on home automation. I have some ideas that I’m sure will drive Sarah crazy.

Ramblings

We Live In The Future

The computer on my desk has one TERABYTE of space, and it’s almost half full - ten years ago I didn’t even have a gigabyte of space in my main machine. I carry a computer … in my pocket … that I can use to surf the web anywhere. No wires. And I can use my pocket computer to show pictures of our vacation half a world away to a friend over a dinner of seafood. We’re hundreds of miles from the coast. And I can use this same device to call around the world at any time.

In a few months, I’m going to get on an airplane and fly from my home in Alabama to London. I’m going to FLY. Through the air. And I’m going to be there in a little over 10 hours. A hundred years ago to get from London to New York took like two weeks via steamship, and then you had to travel by train and horse carriage. It could take a month or more to travel that distance, but I’m gonna do it in 10 hours.

When I was a kid, our TV got 3 channels on an old 19” set that took minutes to warm up. My TV today has close to 500 channels, all of them perfectly clear and some of them in beautiful high-definition. Oh yeah, and it’s 42”, thinner than a ream of paper is wide, and it turns on almost instantly.

I never go to a bank anymore. My paycheck is electronically deposited to a bank that has one physical location … in Texas, more than five hundred miles away. And if for some reason I do have to deposit a check, I can scan it in at my house and send it by computer to be instantly deposited.

People, welcome to the future. We’re here. And it’s just going to get cooler.

Ramblings

RIP Michael Jackson

I grew up in the 80s, against the backdrop of Michael Jackson’s music. I remember my parents listening to Thriller, Billie Jean, and Beat It. In many ways, Michael Jackson defined music in the 80s.

Microsoft

Why Bing Sucks

So I see Microsoft is attempting to rebrand the old Windows Live Search as bing.com. The commercials on TV are advertising it as a different type of search engine - a “decision engine.” Yeah, when I heard that, I, too, wondered exactly what a “decision engine” was. But the commercials are clever and somewhat funny to anyone who has ever spent time searching through hundreds of results for a single missing piece of information. But where’s the meat?

My coworker Brian, a few weeks ago, provided a great example of how this claim of being a “decision engine” is kind of a joke. And it can be summed up in a single question: “How big is the sun?”

Maybe now you’re confused about what I’m talking about. What does the sun have to do with search engines? Well, try plugging that question, word for word, into your favorite search engine. Out of curiosity, I ran this search on a number of top and up-and-coming engines to see what they returned.

  • Google is obviously the 900-pound gorilla in this space, so they’re a logical place to start. When you ask Google “How big is the Sun?” Big Brother Google replies, right at the top, “Mass: 1.9891 × 10^30 kg (332,946 Earths),” with most of the results relevant to the question at hand. In fact, all but two of the results were directly relevant to the question asked.

  • Yahoo didn’t return a nice little piece of math like Google did, but all but one of the search results is _directly_ relevant to the question asked. The only result that wasn’t relevant was that VH1 has some videos by a band called Big Sun, but that was towards the bottom of the SERP.

  • The newcomer Wolfram Alpha, which bills itself as a “knowledge engine,” gives you a simple result, 432,200 miles, along with a handy formula for conversion. It’s not a traditional search engine, but it’s closer to a “decision engine” than Bing …

  • And finally, the “decision engine” Bing. So how does the vaunted “decision engine” handle knowing how big the sun is? It doesn’t.

The first result is a garden furniture store in Austin, Texas. The second result is an Equine Product Store in Florida. The third was pictures of the sun from the Boston Globe - okay, that one was close. The next results are a realty company in Florida and an athletic conference. Only then, six results down, do we get into the meat of the question.

Look, it’s easy to hate on Microsoft. It’s no challenge anymore. I, personally, am not exactly a fan of Microsoft, but I’m hardly an enemy either. At worst, I’m indifferent.

And, as an aside, I really feel sorry for the poor guy they send to the OSCON keynote every year who literally gets hammered for no good reason by what can only be described as nerd rage from the questioners. And yet every year, they come back with more money and more people. I almost posted an entry about it last year. It was really kind of sad to watch.

Anyways, the point is, there are some things that Microsoft _has_ done well. Office? Great productivity suite. Windows 7? From what I’ve seen, it looks pretty good. The Xbox and gaming units at Microsoft do gangbusters. But it just seems like they’re irrationally pursuing this search thing, out of spite at this point, to the detriment of the rest of their business. Considering that Bing doesn’t appear, on the surface, to be any more useful than Windows Live Search (which is to say, not very), Microsoft is throwing tons of money, in the form of development and marketing, at something that just isn’t very good when they could be focusing on the core parts of their business.

But, then again, I’m not Ballmer.

Ramblings

Iran Elections ... or ... will the revolution be Twittered?

I’m sure many of you have been following what’s been happening in Iran, right? Or maybe you haven’t because, as often happens with international events, the American media has dropped the ball in the most epic of fashions. And I’m talking Ed Scissum (God bless him) fumbling deep in Bama territory to give Auburn the win in the ’97 Iron Bowl levels of dropping the ball. It’s been that bad.

PHP

Drama? In My Developer Community?

… it’s more likely than you think!

And here I thought drama was isolated to fandom mailing lists and MySpace!

I was not at php tek this year. I keep meaning to make it to that conference, but, let’s face it, the week before Memorial Day is a really lousy time to have a conference. I usually like to take that Friday off to make it a long weekend. I may finally make tek next year, though. But even if I had gone, I don’t usually get invited to the cool parties. It’s really for the best, though. I’d rather end up drunk in a bar listening to good music than try to discuss functions and benchmarking after having imbibed a large quantity of booze, or make an ass out of myself by diving into bushes. Ask me about that some other time.

Apparently, at php tek, at one of these “cool-people-only” parties (okay, it was apparently an after-hours panel), a bunch of people cooked up the idea of having a uniform PHP coding standard among their own projects, with the goal of having it adopted as some type of official standard. Now, in and of itself, this sounds like a good idea. Most other languages have at least a suggested set of best practices (Sun’s coding conventions for Java or Apple’s for Cocoa come to mind), even if you don’t use them. Every job I’ve worked in has had some standard, even if I had to write it. Most of them were derived from the PEAR standard, including what we do at dealnews. But hey, variety is the spice of life, right? What’s the harm in another choice?

Nothing. So we’ve established that the idea of having a[nother] PHP coding standard is not necessarily bad. The problem, as with all things, is what happened next…

  1. Somehow, they managed to get a closed mailing list on php.net. Think about that for just a second. This group, composed of some guys from some projects with no official relation to PHP other than being users of it, somehow ended up with their own mailing list hosted on php.net. WTF? I would love to know how that happened. More to the point, this will cause conceptual confusion among new, and even existing, users. When I first heard about this, my first thought was: hey, this is on PHP.net, right? It must have some kind of official recognition, right? Well, as far as I can tell, it doesn’t. It’s just … some guys. Put yourself in the shoes of a new PHP user, visiting PHP.net for all your manual needs. Oh, what’s this? Standards? Well, I better use those!

  2. It was a suspiciously closed action for such an open-source project. The original mailing list was a closed list until Rasmus himself opened it, and the members don’t exactly seem keen on welcoming any input from anyone outside their little clique. Some of the things being said by the “PHP Standards Group,” quite frankly, make me very suspicious of their motives. Things like “All of us are too busy, both with real jobs and our various projects, to fight the battles that come of trying to make this a completely open process where anyone with an email address can contribute” reek of self-aggrandizing nonsense.

I’m sorry, but that’s bullshit. Plain and simple. And the fact that no one else in the group has stood up to say otherwise speaks volumes. There’s a phenomenon I have seen occur on mailing lists called implicit acceptance: if you don’t stand up and say otherwise, you are implicitly agreeing with the stated course of action. So, if anyone in this group disagrees with the stated opinions, guys, now’s the time to man up.

If you’re going to have a mailing list on php.net, and call yourselves the “PHP Standards Group,” you need to welcome input from the PHP community - all of us - not just your group. Otherwise, you don’t need to be on php.net, and you don’t need to be calling yourselves the “PHP Standards Group.”

  3. It is overly focused on OO. I know a lot of people think that objects are the answer to everything. I have strong disagreements, but I will save those for a later post. But (kind of tying into my previous point) there are a _lot_ of people using PHP in a strictly functional way, or in a way that sanely mixes functional and object-oriented programming. Any standard - if it’s going to be called a PHP standard - needs to take all widespread uses of PHP into account, not just OO.

Now, as I said before, I’m not a “cool person.” I don’t have CVS commit access. I don’t have thousands of followers on Twitter or a cool blog (no offense to my five regular readers - you guys rule and I’ll buy you a round sometime!). I’m just some guy who’s been writing PHP for the last nine years or so. So, while it appears this “group” probably won’t care what I have to say anyways, here is my humble suggestion for a path forward.

**Figure out the semantics.** Notice that all this stuff we’re talking about is appearances and semantics. Nobody is discussing the actual proposals (as they have been made) so far, just the actions of the people involved. What exactly is this project trying to accomplish? Are you trying to write a standard for your project(s), or are you trying to produce something useful for the community? If this is just for your project(s), move it off php.net, call it something else (“The Shared Standards Working Group” or some other such nonsense), and do whatever the hell you want. But if you’re going to call yourselves the “PHP Standards Group” and have your project on php.net, you have to welcome input from the community, even if you ultimately discard it.

The thing I don’t understand is why this group appears so afraid of public input. Okay, the signal-to-noise ratio can get pretty bad sometimes, sure. But for every ten, hundred, or five hundred bogus suggestions you get, you may get one really good one - one you might not have thought of yourself, and that no one in your tight little circle might have seen. That is the true power of any open-source project. I would urge the “PHP Standards Group” to overcome their fear of public input and let us - the users - have a say in the process.

As always, this represents my own views only, and not those of my employer, the beer I’m drinking (Fat Tire Amber) or my cat.

Apache

PECL memcache and PHP on Mac OS X Leopard

Wow, has it really been that long since I’ve written here? I really need to do better.

So I ran into an interesting issue this evening configuring PECL memcache to run on my Macintosh. To give you a bit of background, I use the built-in copy of Apache, but with PHP (currently 5.2.8) compiled from source, since the version in Leopard is old and I needed some things it didn’t provide. After that was installed with no problems, I went to the ext/memcache-3.0.4 directory to compile memcache like so:

phpize
./configure
make
make install

Then I added it to php.ini as an extension and restarted Apache. But it didn’t work. The information returned from phpinfo() still indicated it had not been installed. So I checked the logs and found this little gem:

PHP Warning:  PHP Startup: Unable to load dynamic library '/usr/lib/php/extensions/no-debug-non-zts-20060613/memcache.so' - (null) in Unknown on line 0

Okay. WTF does that mean?

While Googling around for an answer, I came across this page. According to it, this error is a strong indication that the shared extension is causing a segmentation fault, most likely because it was compiled against the wrong architecture. Fortunately, there is a solution - force configure to use the right architecture:

make clean
MACOSX_DEPLOYMENT_TARGET=10.5 CFLAGS="-arch x86_64 -g -Os -pipe -no-cpp-precomp" CCFLAGS="-arch x86_64 -g -Os -pipe" CXXFLAGS="-arch x86_64 -g -Os -pipe" LDFLAGS="-arch x86_64 -bind_at_load" ./configure
make
make install

Now restart Apache. You should have working memcache!
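
To confirm everything is wired up, here is a quick PHP sanity check. It assumes a memcached server is listening on localhost:11211 - adjust the host and port to match your setup:

<?php

// Make sure the extension actually loaded.
var_dump(extension_loaded('memcache'));

// And do a quick round trip through a running memcached server.
$memcache = new Memcache();
$memcache->connect('localhost', 11211);
$memcache->set('test_key', 'hello from pecl memcache', 0, 30);
var_dump($memcache->get('test_key'));

?>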
