
CakePHP Meetup this Friday in Berlin

Posted on 29/1/09 by Tim Koschützki

Hey folks,

as we announced a week ago, there will be a meetup of fellow Cakers in Berlin tomorrow. Here are the details:

Location:

Amaretto Due
Rigaer Strasse 30 (Ecke Gabelsberger Str.)
Check on Google Maps
http://www.amaretto-due.com/

Time

8pm GMT+1 (German time).

Why this bar?

It's the bar we always go to. The food there is good and so are the prices (cocktails are 3 EUR each).

How do I find you?

Felix and I are tall men who will wear pink socks with small red hearts on the shoes. There will be some beer on the table. You cannot miss us.

What to bring?

Fun and the will to talk about what you are currently doing with CakePHP and what you plan to do. :) You can also bring your laptop, but we've opted for a no-wifi location so we can focus on talking, not surfing!

-- Tim Koschuetzki aka DarkAngelBGE

 

Donate your PHP arrays!

Posted on 29/1/09 by Felix Geisendörfer

Today I needed a list of mime types and extensions to determine the proper extension to assign to a file with a certain mime type. It wasn't difficult to find one.

However, as always when looking for tables and stuff like this on the web, I had to convert the data for use in PHP. Usually I'd use my text editor and a few regexes for the job.

But not today. Today was the last time I wasted energy on importing trivial data sets into PHP.

Why? Well, because I believe a fortune in man hours is lost to this process on a daily basis. It is time to stop that.

It is time for a new project called php arrays. (Warning: don't open anything but extensions.php / mime_types.php in your browser; the larger files may freeze it due to the client-side JS syntax highlighting. Use the download link for the whole repo instead.)

Basically it is just a collection of PHP files, each containing one big array. So far I have done a little work on two generators that currently produce the following arrays: big_cities.php (population > 15000), countries.php, extensions.php, mime_types.php and time_zones.php. Whenever I need to work with other stuff I'll add it there as well.
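
To give a rough idea of how such a file could be used, here is a minimal sketch. It assumes the file returns its array directly and that the map goes from mime type to extension; check the repository for the actual layout and names.

<?php
// Minimal usage sketch; assumes mime_types.php returns one big
// mime-type => extension map (the real file may define a variable instead).
$mimeTypes = include dirname(__FILE__) . '/mime_types.php';

// Hypothetical helper: pick the proper extension for a given mime type.
function extension_for($mimeType, $mimeTypes) {
  return isset($mimeTypes[$mimeType]) ? $mimeTypes[$mimeType] : null;
}

echo extension_for('image/png', $mimeTypes); // e.g. "png"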

Now it is time for you to join the fun and share PHP arrays you generated or aggregated in the past. Fork the project on GitHub or send me a link to your array paste.

-- Felix Geisendörfer aka the_undefined

PS: What other arrays would you like to see added to the collection in the future?

 

Suppressing PHP errors with emptiness

Posted on 28/1/09 by Felix Geisendörfer

PHP's language constructs that disguise themselves as functions are a bitch! I didn't know that empty() does not throw errors when accessing non-existing array keys, but it's actually in the manual. Thanks to everybody who pointed it out!

So please consider my previous post garbage as far as the actual example is concerned. The proper solution is:

if (!empty($step['options']['merge'])) {
  // do stuff
}
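
For illustration, here is a tiny sketch of that behavior (array and keys made up): a direct read of the missing key would raise an undefined index notice (a warning on newer PHP versions), while empty() checks it silently.

<?php
$step = array('options' => array());

// Direct access would complain:
// $flag = $step['options']['merge']; // notice/warning: undefined index/key

// empty() is a language construct and stays silent for missing keys:
var_dump(empty($step['options']['merge'])); // bool(true), no notice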

However, my threat of eventually using @ and breaking with other "best practices" remains. I will use whatever solution solves more problems for me than it creates ; ).

-- Felix Geisendörfer aka the_undefined

 

Suppressing PHP errors for fun and profit

Posted on 28/1/09 by Felix Geisendörfer

Update: The conclusions below are wrong, empty() is the best solution to the particular problem, read the follow up post for more information.

As of late I am getting sick of some best practices I have taught myself. Never using the @ error-suppression operator is quickly moving to the top of the list.

Before you start crying out loud (I know you will), let me say this: I do not mean to encourage anybody to use the @-operator. Applying the practice herein introduced may result in permanent damage to your coding habits and could serve as a gateway behavior to writing shitty code.

Ok, you have been warned, let's move on. Today I found myself writing the following line of code:

if (isset($step['options']['merge']) && $step['options']['merge']) {
  // do stuff
}

I was basically checking for a sub-key of an element to be true(ish). However, I knew this sub-key might not be included in the array under some circumstances, so a direct check for it could trigger a PHP notice.

You can probably feel the pain of conflict here. On one side I want to write very short and expressive code. On the other side I am also trying to follow the so-called "best practices". Doing both here is impossible.

Of course there is always the option to normalize the array before doing the check:

$step['options'] = array('merge' => false) + $step['options'];
if ($step['options']['merge']) {
  // do stuff
}

This however still adds a basically useless line of code. Plus, normalizing a multi-dimensional array like this requires a more sophisticated merging algorithm than PHP provides out of the box (CakePHP's Set::merge() function provides good multi-dimensional array merging, but that would be throwing even more code and CPU cycles at the problem).
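
For reference, a sketch of what that deep-merge normalization could look like inside a CakePHP app where the Set class is available; the $defaults array here is made up.

<?php
// Sketch only: deep-merge made-up defaults into $step before the check.
// Assumes CakePHP's Set class is loaded.
$step = array('options' => array());
$defaults = array('options' => array('merge' => false));
$step = Set::merge($defaults, $step);

if ($step['options']['merge']) {
  // do stuff
}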

So today, for probably the first time in years, I wrote the following line of code:

if (@$step['options']['merge']) {
  // do stuff
}

It still causes me mild pain to look at. I hope for that to go away soon. But I no longer have the strength to close my eyes to the very true lesson to be learned here: Using the @-operator like this solves more problems for me than it creates.

This may not be true for you. Using the @-operator is 4x slower than solution #1 and 1.3x slower than #2. If you run a loop with 100000 records, this is 150ms wasted (on my laptop, your mileage may vary). However, my scenario is more like running into this 5-10 times for any given request, at 0.0015ms per check. So we are talking about well under 0.02ms here.
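
In case you want to reproduce such a comparison yourself, here is a rough micro-benchmark sketch; the loop count and the test array are assumptions, and the absolute numbers will differ between machines and PHP versions.

<?php
// Rough micro-benchmark sketch for the three checks discussed above.
$iterations = 100000;
$step = array('options' => array()); // 'merge' key intentionally missing

$start = microtime(true);
for ($i = 0; $i < $iterations; $i++) {
  if (isset($step['options']['merge']) && $step['options']['merge']) {}
}
$issetTime = microtime(true) - $start;

$start = microtime(true);
for ($i = 0; $i < $iterations; $i++) {
  $options = array('merge' => false) + $step['options'];
  if ($options['merge']) {}
}
$normalizeTime = microtime(true) - $start;

$start = microtime(true);
for ($i = 0; $i < $iterations; $i++) {
  if (@$step['options']['merge']) {}
}
$suppressTime = microtime(true) - $start;

printf("isset: %.4fs, normalize: %.4fs, @: %.4fs\n",
  $issetTime, $normalizeTime, $suppressTime);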

It is not like I don't care about my CPU-cycle footprint. The opposite is true. By sparing myself the pain of typing up and maintaining more verbose-than-necessary code I can save millions of CPU cycles elsewhere! I can go and optimize stuff that actually needs to be optimized.

How will you choose?

-- Felix Geisendörfer aka the_undefined

PS: In case you wonder, @-suppressed errors do not show up in any error logs. So no problems with that.
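
If you ever do need to see what was suppressed, PHP still records the last error even when @ hides it (assuming no custom error handler swallows it first); a quick sketch:

<?php
$step = array('options' => array());

if (@$step['options']['merge']) {
  // do stuff
}

// The notice never reaches the display or the error log, but it can still
// be inspected while debugging (message wording varies by PHP version):
$error = error_get_last();
if ($error !== null) {
  echo $error['message']; // e.g. "Undefined index: merge"
}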

 

Quickly generate tons of test data

Posted on 23/1/09 by Felix Geisendörfer

I am lazy. I think it's because I was raised a programmer and grew up imagining a world where computers would do all the work for me.

Unfortunately however, our binary offspring is still asking for a great amount of our attention and time before we can retire and hand the work over to our fabulous creations.

But that doesn't mean we can't make the kids clean our dishes, wash our clothes or generate our test data! So at some point last year I was working on my own test data generator, but it turns out it is quite a ton of work to aggregate good data from various places and build a nice generator on top of it.

No worries, the story has a happy ending. The universe is kind to the lazy and, hearing my cry of need, created a whole array of products and web services to address the problem!

Hello generatedata.com

From the solutions I have tried so far, generatedata.com, made by the friendly folks at Black Sheep Web Software, has worked the best for me.

However, it took me a little time to streamline my workflow with it, so here are a few tips you should definitely know about when using the service:

Tip 1: Generate UUIDs

If you are using UUIDs in your application and want to generate random ones, there is currently no option for that at generatedata.com. However, by selecting the type "Alpha-Numeric" and leaving the drop-down at "Please select", you can enter your own pattern for the string to be created. Here is the one I came up with for UUIDs:

lxlxlxlx-lxlx-lxlx-lxlx-lxlxlxlxlxlx

Tip 2: Generate custom strings

If you need even more customized strings, you can use the placeholders provided by the service. As you can see, I simplified UUID generation above by simply combining random lowercase letters with random numbers - not even close to perfect UUIDs, but very kick-ass for test data ; ). (A rough local equivalent of this placeholder idea is sketched after the list below.)

L	An uppercase letter.
l	A lowercase letter.
D	A letter (upper or lower).
C	An uppercase consonant.
c	A lowercase consonant.
E	A consonant (upper or lower).
V	An uppercase vowel.
v	A lowercase vowel.
F	A vowel (upper or lower).
x	Any number, 0-9.
X	Any number, 1-9.
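
And here is the rough local equivalent mentioned above: a small sketch that fakes the same placeholder idea in PHP, e.g. for seeding data from a script instead of the web form. Only the 'l' and 'x' placeholders from the UUID pattern are implemented, and the function name is made up.

<?php
// Sketch: expand a pattern of 'l' (lowercase letter) and 'x' (digit 0-9)
// placeholders into a random string; all other characters pass through.
function fill_pattern($pattern) {
  $result = '';
  foreach (str_split($pattern) as $char) {
    if ($char === 'l') {
      $result .= chr(mt_rand(ord('a'), ord('z')));
    } elseif ($char === 'x') {
      $result .= mt_rand(0, 9);
    } else {
      $result .= $char;
    }
  }
  return $result;
}

echo fill_pattern('lxlxlxlx-lxlx-lxlx-lxlx-lxlxlxlxlxlx');
// e.g. "a1b2c3d4-e5f6-g7h8-i9j0-k1l2m3n4o5p6"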

Tip 3: Connect to foreign keys

The whole generator is kind of nice and all, but if the table you are generating data for has foreign keys that point to other tables, you need those links to work, right? After all, what do you need test data for if not to see the parts of your app working together ; ).

Anyway, it turns out this is not a particularly difficult problem either. From the data type drop-down select "Custom List" and again leave the "Please select" option in the second drop-down where it is. Now you can provide your own list of strings, separated by "|" (the pipe character), to be used for this field.

How do you get a good list that matches your foreign keys? Easy. Let's say you have a key "user_id" in your table and you need a good list for that. Simply run this MySQL query:

SELECT GROUP_CONCAT(id SEPARATOR '|') FROM users;

This will return an already perfectly formatted list, ready to be copied & pasted into the list field! And as you can see, you can get very fancy at this point by adding conditions and other things to get the data you want! This can also be handy if you already have a set of data (like paths to uploaded images <- sorry, you'll still have to do this manually) that you want to re-use.
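
If you would rather build that list from a script than from the MySQL console, here is a quick sketch using PDO; the DSN, credentials and table name are placeholders for your own setup.

<?php
// Sketch: build the same pipe-separated id list from PHP via PDO.
$pdo = new PDO('mysql:host=localhost;dbname=my_db', 'user', 'password');
$ids = $pdo->query('SELECT id FROM users')->fetchAll(PDO::FETCH_COLUMN);
echo implode('|', $ids);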

Tip 4: Quickly getting the records into your table

For me the quickest way to get the records into the table so far has been to select "SQL" as the result type at the top, enter my table name at the bottom and disable the "create table" option. Then after hitting generate I hit Command+A to select everything and paste the whole thing into my MySQL terminal (you can also use a GUI client like phpMyAdmin for that).

If you are on OSX you can also pipe your clipboard into mysql directly like this:

pbpaste | mysql my_db

Start using big sets of test data now

There is no point in always testing a sophisticated app with a set of < 10 records you entered by hand. Especially if those are full of profanity and can't be handed off to a client anyway ; ).

For example, I am currently working on a very nice dashboard for our client to help analyze some click data we are tracking. I started out with just a handful of clicks I generated manually, but realized I needed 1000+ records to see if my SQL queries are good enough and to see the charts become all pretty. Needless to say, I caught several issues that only became apparent through the amount of data.

So if you are developing a system, even if it's very small - generate some good test data for it today. Nothing is a worse productivity killer than not being able to test something properly due to a lack of data.

-- Felix Geisendörfer aka the_undefined

PS: If you are re-doing an existing system or otherwise have the chance to use "actual" data to test with, spend some time getting that going. Real records are always better (read: worse, but in a good way) for testing your application.

 