First two weeks of dandelion

I don’t write here anymore. You can now find me at coderchrismills.com.

I originally wrote this post on 2 May, but forgot to post it.

Well, the app has been out for two weeks now and it’s time for the reflection post. Since the app is free and I never intended for it to have any monetization scheme, I decided against putting analytics in it. The only data I have comes from Apple’s old iTunes Connect reports (not their new fancy analytics thing). So, after two weeks, let’s look at some data.


Dandelion Breeze – Now Available


So after getting rejected and updating the app, Dandelion Breeze was accepted this morning and is now available in the App Store. Really excited to finally have it available to people.

One of the more interesting parts of developing watch apps was going through this initial submission process. I read plenty of docs, stayed on top of the forums, and did everything I could to make sure that I would make it through on the first try. Didn’t help, still got rejected.

This sums up how I felt going into review, so when I was rejected it wasn’t a terrible surprise. The reason for my rejection was the following:

10.1 – Apps must comply with all terms and conditions explained in the Apple iOS Human Interface Guidelines

10.1 Details – Your Apple Watch app icon does not comply with the Apple Watch Human Interface Guidelines.

Specifically, since your app’s background color is black, your Apple Watch app icon does not appear circular.

Next Steps

Please modify the Apple Watch app icon with a lighter background color to ensure that it is recognizable and appears circular on the Apple Watch.

Now I’m not sure where in the linked HIG page it says anything about app icon background colors and making sure the icon appears circular, but I do understand the intent of the tester bringing this issue up. Since the watch background is black, my icon would appear as an irregular shape, creating nonuniform spacing between the surrounding icons. This is bad for the user and makes the OS feel less polished. So I understand why this was flagged; I happily changed the icon and resubmitted.

What I would like to see from Apple, and I know I’m not the first to ask for this, but coming from console development I’d love to see a cert requirement list. Yes, Apple does publish some guidelines, but they’re vague and incomplete. For instance, 2.1 (Apps that crash will be rejected) and 2.2 (Apps that exhibit bugs will be rejected). When was the last time an app you used crashed or had a bug? How does Apple test these? Are there steps that I, as a developer, can take to reproduce their test scenario? Console cert docs are typically an Excel sheet with detailed explanations of what they’re testing against. Sure, it costs some money to go through the cert process, but at least you know what you’re getting into.

I don’t assume Apple will change much in the way they do their testing or in the docs they make available to developers. That being said, I hope the process improves over time.

Dandelion Breeze and What’s Next

Dandelion Breeze has been submitted and is waiting for review. In the meantime I’m off to learn new things. Working on Dandelion Breeze I didn’t get to play with much of the Apple Watch. Since I knew I wasn’t going to have one for development, I tried to keep what I did with WatchKit to a minimum. Now, with a bit of experience and my Watch soon to be in hand, I want to do a bit more with it. The next things on my list to learn are glances and notifications. Glances are interesting as they provide the user a quick look at pertinent information. It’s almost like a notification, except instead of asking the user for a moment of their time, they’re giving you some of theirs for a snippet of information. You’ve made a contract with the user: for a bit of their time, they’re trusting you to give them the information they’re looking for, delivered within a few seconds. Because of the short time scale in which the user interacts with the glance, and the contract you’ve made with them, it’s critical that glances be a condensed form of the most important information you want to show your user.
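I haven’t built one yet, but from the docs a glance is just another WKInterfaceController subclass backed by a storyboard scene, so a first pass might look something like this. The class name, outlet, and activity string below are my own placeholders, not anything from Dandelion Breeze.

#import <WatchKit/WatchKit.h>

// Hypothetical glance controller; class name, outlet, and keys are placeholders.
@interface GlanceController : WKInterfaceController
@property (weak, nonatomic) IBOutlet WKInterfaceLabel *seedsBlownLabel;
@end

@implementation GlanceController

- (void)willActivate
{
    [super willActivate];

    // A glance is on screen for only a few seconds, so set the single most
    // important piece of information immediately.
    NSInteger seedsBlown = [[NSUserDefaults standardUserDefaults] integerForKey:@"seedsBlown"];
    [self.seedsBlownLabel setText:[NSString stringWithFormat:@"%ld seeds set free", (long)seedsBlown]];

    // Tapping a glance launches the watch app; hand it some context so it can
    // jump straight to the relevant screen.
    [self updateUserActivity:@"com.example.dandelion.glance"
                    userInfo:@{@"source" : @"glance"}
                  webpageURL:nil];
}

@end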

So far I like trying to build things for the Apple Watch. The shorter interaction times and the lack of some controls provide a challenging environment to try and make useful and interesting apps.

I’m feeling really motivated to make new things, but the NBA playoffs are on, so my focus might not be what it usually is.

Learning by Doing

I’m a big fan of learning tests and learning by doing. The primary reason I made Dandelion Breeze was to learn the WatchKit SDK, Swift, and SpriteKit. It was definitely fun and informative making this app, and I always intended to release it completely for free. No ads, no in-app purchases (for now; I might add a tip jar later depending on the number of downloads), and no initial cost. It’s a learning test that’s being released.

David Smith, of Developing Perspective, recently talked about making a Simon Says type game (Episode 214 : App Store Snowstorm). In this episode he talks about “feeling like a special snowflake,” which is the idea that even though you feel like your ideas are new and novel, “you are a snowflake in the midst of the App Store snowstorm.” To paraphrase an excerpt,

When you’re working, it’s easy to get motivated into thinking that what you’re doing is new and unique. More often than not, what you’re doing is in some way derivative. If you’re making something that gets its market viability via its uniqueness then you might be in trouble.

His app Pedometer++ is not a unique idea; it’s the first learning test you could make with the Apple M7 motion coprocessor, and really, it was his learning test for the SDK (check out Episode 146 : Get Up, Get Moving for more). When it launched, the app had its fair share of copycats. Being first does not make you the best; being distinctive helps, but there’s no guarantee that you will gain any traction in the App Store with it. Something that helps with finding your distinctiveness when something new comes out, like a new SDK, is to experiment with it. Experimentation is how we learn. It lets us have fun while allowing us to develop our skill sets. Like he says, “if anything, both patience and persistence as a concept, and willingness to experiment are the two biggest things you need to have to be a successful independent developer. And to have tolerance for failure.” Part of learning by doing is a willingness to fail over and over again until you’ve learned the concept you want, and to continue experimenting to further build your understanding of the topic. I think it’s important to have fun when learning; it shouldn’t be painful.

If you want to learn something, go get your hands dirty and build. Build anything really; it doesn’t matter how small, just go make things. Tutorials and reading are a fine way to start, or to research that one component you need, but making learning tests, isolating the thing you’re learning from larger systems, provides a distraction-free environment to experiment. Dandelion Breeze gave me a platform to make an Apple Watch extension app. Super simple: a button that, when pressed, plays an animation, registers the tap, and plays another animation. Repeat. Simple, yes, but it taught me about device image caching, watch layout, and group containers. The phone component of the app is a series of sprites moving across the screen waiting to be tapped. With it I learned some basic Swift, SpriteKit, and some minor physics stuff. In the end I have two learning tests that have come together to make something I can ship and that I’m proud of.
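To give a sense of how small that phone-side learning test really is, here’s a rough Objective-C sketch of the idea: sprites drifting across the scene, waiting to be tapped. The real app is written in Swift, and the class name, asset names, and timings here are made up for illustration.

#import <SpriteKit/SpriteKit.h>

// Sketch of the phone-side idea: sprites drift across the scene and pop when
// tapped. Names, assets, and timings are placeholders.
@interface BreezeScene : SKScene
@end

@implementation BreezeScene

- (void)didMoveToView:(SKView *)view
{
    // Spawn a new seed every couple of seconds, forever.
    SKAction *spawn = [SKAction performSelector:@selector(spawnSeed) onTarget:self];
    SKAction *wait  = [SKAction waitForDuration:2.0];
    [self runAction:[SKAction repeatActionForever:[SKAction sequence:@[spawn, wait]]]];
}

- (void)spawnSeed
{
    SKSpriteNode *seed = [SKSpriteNode spriteNodeWithImageNamed:@"seed"];
    seed.name = @"seed";
    seed.position = CGPointMake(-seed.size.width,
                                arc4random_uniform((uint32_t)self.size.height));

    // Drift across the screen, then clean up once off-screen.
    SKAction *drift = [SKAction moveByX:self.size.width + seed.size.width * 2
                                      y:0
                               duration:8.0];
    [seed runAction:[SKAction sequence:@[drift, [SKAction removeFromParent]]]];
    [self addChild:seed];
}

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    CGPoint location = [[touches anyObject] locationInNode:self];
    SKNode *node = [self nodeAtPoint:location];
    if ([node.name isEqualToString:@"seed"]) {
        // “Pop” the seed: fade it out and remove it.
        [node runAction:[SKAction sequence:@[[SKAction fadeOutWithDuration:0.2],
                                             [SKAction removeFromParent]]]];
    }
}

@end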

David Smith actually reinforced much of my thinking along the lines of making sure you’re having fun in building your apps, or whatever you want to build. In Episode 154 : Something You Are Proud Of, his tag line is “It matters that you can make a living — but it matters more how you make your living.” I love this line, and it really drives my desire to become an independent developer, working from home (or a coworking space). Getting paid to learn by doing is my dream. Developing new apps and new experiences, all while being able to support myself, would be amazing. Until then, I’ll continue to make games at a proper job and use my nights and weekends to learn and play. Experiment. Learn. Play. It’s how to get better at anything.

Dandelion Breeze – Watch Extension


Last post I talked a bit about my new app Dandelion Breeze at a high level. I’m going to spend this post going over what it has been like to develop for the Apple Watch. This is an incredibly simple app, but it’s been informative enough to help me plan for future Apple Watch apps. First off, pretty screenshots.

Screenshot0

Screenshot3

Coming from iOS and UIKit, the watch was familiar, but in many ways different. For instance, rather than UIImage and UIView, the watch’s counterparts are WKInterfaceImage and WKInterfaceGroup. The first adjustment I had to make was having no way to layer views on top of each other. Two images can’t occupy the same space. You can use the group’s background image to simulate this, but you won’t have anywhere near the control you might be used to with UIImageView and UIKit. In the screenshot above, the gradient is an image set on the group and the dandelion is a WKInterfaceImage within that group. A great (quick, dirty, and a bit crazy) use of groups and images is this Equalizer.
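To make that concrete, here’s roughly what faking the layering looks like. A minimal sketch, assuming a controller with the group and image wired up as storyboard outlets (the names are placeholders, not the shipping code):

#import <WatchKit/WatchKit.h>

// Placeholder names; the hierarchy itself lives in the storyboard, so code can
// only swap images on the elements it is handed.
@interface BreezeInterfaceController : WKInterfaceController
@property (weak, nonatomic) IBOutlet WKInterfaceGroup *backgroundGroup;
@property (weak, nonatomic) IBOutlet WKInterfaceImage *dandelionImage;
@end

@implementation BreezeInterfaceController

- (void)awakeWithContext:(id)context
{
    [super awakeWithContext:context];

    // The “layering”: a gradient as the group’s background image, with the
    // dandelion as a WKInterfaceImage sitting inside that group.
    [self.backgroundGroup setBackgroundImageNamed:@"gradient"];
    [self.dandelionImage setImageNamed:@"dandelion"];
}

@end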

The next thing I was not really ready for was having no [UIView animate] methods. You can use [InstanceOf_WKInterfaceGroup startAnimating] to animate the currently set image sequence, but you can’t, say, animate the size of the group or a button within it. WatchKit is separate from UIKit; though it has similarities, it’s its own monster. I actually didn’t realize how often I was calling [UIView animateWithDuration…] in my code until I didn’t have access to it. So often you want subtle animations to give your UI life, but on the watch they’ll need to be image sequences.
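In practice that means pre-rendering frames into the watch app bundle and playing them back by name. Something like this, reusing the dandelionImage outlet from the sketch above; the frame names and counts are made up:

// No [UIView animateWithDuration:] here: animations are image sequences.
// Assumes frames named "breeze0"..."breeze29" in the watch app bundle.
[self.dandelionImage setImageNamed:@"breeze"];
[self.dandelionImage startAnimatingWithImagesInRange:NSMakeRange(0, 30)
                                            duration:1.0
                                         repeatCount:1];

// Or simply loop whatever sequence is currently set:
// [self.dandelionImage startAnimating];
// [self.dandelionImage stopAnimating];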

This segues into the last item that I wanted to discuss, the device image cache. Within your watch app you have two main places to put your content: the watch app bundle and the device image cache. If you know the assets at build time, say a series of images in an asset catalog, then you package those into the app bundle. At the time of this writing you have 50 MB of storage for all your stuff. Not much, but considering you’re not building large experiences for the watch it should be enough for now; plus, limitations can be fun. The purpose of the cache is for images that you don’t know at build time. Remember, there are no [UIView animate] methods, so what if we want to animate a graph of data or visualize something else with some motion? This is where the cache comes in. With the image cache we have 5 MB of storage and can put our generated image sequences there via key-value pairs. A small project can be found on my GitHub page that demonstrates how to use the cache and animate images through it.
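A minimal sketch of the cache flow, assuming you’ve already rendered a UIImage on the phone side of the extension; the renderGraphImage helper and the "graph-frame" key are placeholders of mine, not from the sample project:

// WKInterfaceDevice is the gateway to the shared image cache.
UIImage *graphImage = [self renderGraphImage]; // hypothetical Core Graphics helper

WKInterfaceDevice *device = [WKInterfaceDevice currentDevice];
if (device.cachedImages[@"graph-frame"] == nil) {
    // addCachedImage:name: returns NO if the image didn't fit in the 5 MB cache.
    if (![device addCachedImage:graphImage name:@"graph-frame"]) {
        [device removeAllCachedImages]; // crude eviction; real code would be more selective
        [device addCachedImage:graphImage name:@"graph-frame"];
    }
}

// Once transferred, any WKInterfaceImage can reference it by that key.
[self.dandelionImage setImageNamed:@"graph-frame"];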

5 MB is not a lot of storage and it goes by quickly. Also, and more importantly, you will need to wait for the Bluetooth transfer to finish. In the code sample, after the call to [device addCachedImage], the image is “available”, but it needs to wait until the transfer is finished before it will display. If you try to display an image that hasn’t transferred yet, you will see a white spinning indicator letting you know it’s not ready. The transfer can be slow, so don’t expect this to be viable for sequences that are a MB or more in size. Small data, procedurally generated, transferred and accessed with key-value pairs, works pretty well.

I’m making a new app


When the Apple Watch was announced I decided I wanted to take a break from my regular at-home coding projects and develop something for it. Around this time I had also decided that I wanted to learn some Swift, and I was hoping to go for the two-birds-one-stone thing. Between juggling my normal work day and my new love of gardening, it’s taken longer than I had hoped to get it ready, but it’s been awesome nonetheless.

The app is called Dandelion Breeze and it’s a rather simple little “game.” I use the term game loosely as there’s no winning, losing, leaderboards, or anything that would really make it a game. I’ve been calling it an experience when talking to friends about it, as game implies there is a directive of some kind, and app sounds like it should have a point to it too. The iPhone and iPad versions involve you popping little frozen dandelions as they float across the screen. I was hoping the phone and tablet, which imply longer periods of interaction, would feel like a long cold winter, and the watch would be spring and have more life. As I said, I have a new love of gardening, and living in Oregon our spring, though cold and wet at first, is awesome when you get to see the plants start to grow. We go through what feels like a long fall and winter, only to get a few months of warm beautiful weather. The phone is that long winter. Though it is cold and quiet, it can also be incredibly relaxing. The watch, with its shorter periods of interaction, is our quick spring and summer. Here are some screenshots from the phone; I’ll be posting some for the watch maybe tomorrow or Thursday, and with them I’ll be discussing my impressions of watch development and getting used to Swift.

Screenshot0

Screenshot1

Screenshot3

Screenshot2

G+ Redesign Polish


After watching the I/O keynote I decided to head over to Plus and check out the new layout. I noticed that the color of the top search bar area is white rather than the normal grey. I say normal grey as it’s the same grey used in Gmail and in the Play Store: always a dark grey nav bar, followed by a lighter grey search bar area.

Standard G+ top bar

Modified white G+ top bar

Standard Gmail top bar

After some minor tweaks, the search bar now feels more like the standard and the G+ toolbar feels more like a part of the service. It pops more and helps define the content area for the posts. It also now feels more in line with the Hangouts area, which is also white. The secondary impact of this change occurs when you scroll down. The first gif shows how it is currently implemented.

Standard G+ top bar animation

Here, the grey pops to white, a drop shadow appears, and some new UI elements animate in. This color change is a bit distracting and, if the toolbar were white to begin with, would be unneeded. The shadow, while not too distracting, could also be added to the Hangouts area as well as to the toolbar at all times. I’m not entirely sure why they even need a drop shadow during this scroll transition. It may help to define the content area, but that can be done with the same thin border they use in other services like Gmail. It moves the UI away from a purely flat UI and adds depth where there was none. I’m not saying drop shadows are bad, just that in this case they might be misused. I decided to try it with and without the drop shadow to see how distracting it really was.

Modified G+ toolbar animation, white without shadow

Modified G+ top bar animation, white with shadow

The last thing that jumps out in these animations is the notification icon. It’s a small thing, but it animates in and does not animate out; it merely disappears. It’s a bit of polish that was missed. However, for me, what has been seen cannot be unseen.

I really like the new G+ layout and the great new features. It really does feel like Google is trying to make a unified design across all of their services, and that consistency, which is great, only highlights the missed attention to detail.

Playing User Audio with iOS


A few months back someone asked me to show them how to play audio from a user’s music library on iOS. I took the opportunity to demo a music analysis program that covers a few things in one project:

  • How to use blocks and NSOperationQueue to load data
  • How to use MPMediaPickerController
  • Simple audio analysis with vDSP

I was really hoping to write a more tutorial-style post on how to do all this, but sadly I’m not going to have the time. That being said, there are a few code snippets that I think will help people a lot if they’re doing something similar. The code here can be a bit tough to read with the way the formatting is done, but it’s all available on GitHub.

First up, loading music using the MPMediaPickerController. In the app I’m using a button to present the picker modally:

- (IBAction)showMediaPicker:(id)sender
{
    MPMediaPickerController *mediaPicker = [[MPMediaPickerController alloc] initWithMediaTypes: MPMediaTypeAny];	
    mediaPicker.delegate = self;
    mediaPicker.allowsPickingMultipleItems = NO;
    mediaPicker.prompt = @"Select song to analyze";
    [self presentModalViewController:mediaPicker animated:YES];
    mediaPicker = nil;
}

So now we need to respond to the user selecting a song. To do this we need to implement

- (void) mediaPicker: (MPMediaPickerController *) mediaPicker didPickMediaItems: (MPMediaItemCollection *) mediaItemCollection

We’ll also need to define an MPMusicPlayerController in our .h file since we’re going to also be handling pausing and playing of the selected track.

MPMusicPlayerController	*musicPlayer; // In the header file
- (void) mediaPicker: (MPMediaPickerController *) mediaPicker didPickMediaItems: (MPMediaItemCollection *) mediaItemCollection
{
    if (mediaItemCollection) {
		NSArray *songs = [mediaItemCollection items];
        [musicPlayer setQueueWithItemCollection: mediaItemCollection];
		
		[loadingView setHidden:NO];
		[activityIndicator setHidden:NO];
		[activityIndicator startAnimating];
		
		NSOperationQueue *queue = [[NSOperationQueue alloc] init];
		[queue addOperationWithBlock:^{
			[self loadSongData:(MPMediaItem *)[songs objectAtIndex:0]];
			[[NSOperationQueue mainQueue] addOperationWithBlock:^{
				[activityIndicator stopAnimating];
				[activityIndicator setHidden:YES];
				[loadingView setHidden:YES];
				
				// Start the timer to sample our audio data. 
				float interval_rate = ((songLength / songSampleRate) / MAX_FREQUENCY);
				[self setAnalysisTimer:[NSTimer scheduledTimerWithTimeInterval:interval_rate
                                                                        target:self
                                                                      selector:@selector(doAnalysis:)
                                                                      userInfo:nil
                                                                       repeats:YES]];
				[musicPlayer play];
			}];
		}];		
    }
    [self dismissModalViewControllerAnimated: YES];
}

In the above code you’ll notice a function called loadSongData; this is where the real work of loading the song data happens. I’ve removed a bit of the full function, which you can find on the GitHub page.

- (void)loadSongData:(MPMediaItem *)mediaitem
{

	NSURL* url = [mediaitem valueForProperty:MPMediaItemPropertyAssetURL];
	AVURLAsset * asset = [[AVURLAsset alloc] initWithURL:url options:nil];

	NSError * error = nil;
	AVAssetReader * reader = [[AVAssetReader alloc] initWithAsset:asset error:&error];

	AVAssetTrack * songTrack = [asset.tracks objectAtIndex:0];

	NSArray* trackDescriptions = songTrack.formatDescriptions;

	numChannels = 2;
	for(unsigned int i = 0; i < [trackDescriptions count]; ++i) {
		CMAudioFormatDescriptionRef item = (__bridge CMAudioFormatDescriptionRef)[trackDescriptions objectAtIndex:i];
		const AudioStreamBasicDescription* bobTheDesc = CMAudioFormatDescriptionGetStreamBasicDescription (item);
		if(bobTheDesc && bobTheDesc->mChannelsPerFrame == 1) {
			numChannels = 1;
		}
	}
	
	const AudioFormatListItem * afli = CMAudioFormatDescriptionGetRichestDecodableFormat((__bridge CMAudioFormatDescriptionRef)[trackDescriptions objectAtIndex:0]);
	songSampleRate = afli->mASBD.mSampleRate;

	DebugLog(@"%f", afli->mASBD.mSampleRate);

	NSDictionary* outputSettingsDict = [[NSDictionary alloc] initWithObjectsAndKeys:
										
										[NSNumber numberWithInt:kAudioFormatLinearPCM],AVFormatIDKey,
										[NSNumber numberWithInt:16],AVLinearPCMBitDepthKey,
										[NSNumber numberWithBool:NO],AVLinearPCMIsBigEndianKey,
										[NSNumber numberWithBool:NO],AVLinearPCMIsFloatKey,
										[NSNumber numberWithBool:NO],AVLinearPCMIsNonInterleaved,
										
										nil];

	AVAssetReaderTrackOutput * output = [[AVAssetReaderTrackOutput alloc] initWithTrack:songTrack outputSettings:outputSettingsDict];
	[reader addOutput:output];
	
	output = nil;
	
	NSMutableData * fullSongData = [[NSMutableData alloc] init];
	[reader startReading];

	while (reader.status == AVAssetReaderStatusReading){
		AVAssetReaderTrackOutput * trackOutput = (AVAssetReaderTrackOutput *)[reader.outputs objectAtIndex:0];
		CMSampleBufferRef sampleBufferRef = [trackOutput copyNextSampleBuffer];
		
		if (sampleBufferRef){
			CMBlockBufferRef blockBufferRef = CMSampleBufferGetDataBuffer(sampleBufferRef);
			
			size_t length = CMBlockBufferGetDataLength(blockBufferRef);
			
			UInt8 buffer[length];
			CMBlockBufferCopyDataBytes(blockBufferRef, 0, length, buffer);
			
			NSData * data = [[NSData alloc] initWithBytes:buffer length:length];
			[fullSongData appendData:data];
			
			CMSampleBufferInvalidate(sampleBufferRef);
			CFRelease(sampleBufferRef);
			
			data = nil;
		}
	}

	[self setSongData:nil];

	if (reader.status == AVAssetReaderStatusFailed || reader.status == AVAssetReaderStatusUnknown){
		DebugLog(@"Something went wrong...");
		return;
	}

	if (reader.status == AVAssetReaderStatusCompleted){
		[self setSongData:[NSData dataWithData:fullSongData]];
	}
	
	fullSongData		= nil;
	reader				= nil;
	asset				= nil;
	outputSettingsDict	= nil;

	
	songLength = [[self songData] length] / sizeof(SInt16);
	songWaveform = (SInt16 *)[[self songData] bytes];
	
	DebugLog(@"Data length %lu", songLength);
	
	
	UIImage *waveimage = [self audioImageGraphFromRawData:songWaveform withSampleSize:songLength];
	if(waveimage != nil)
	{
		[fullWaveformImageViewLandscape setImage:waveimage];
		[fullWaveformImageViewPortrait setImage:waveimage];
	}
	[self setFullWaveformImage:waveimage];
	
	nLastPlayIndex		= 0;
}

Alright, now the song data is ready to use and we can play the song with [musicPlayer play]. So far we’ve gone over using blocks to load a song without locking up the UI and selecting songs from your music library. The only thing left to quickly go over is the vDSP library. It’s simpler than you might imagine; the Accelerate framework is, in my opinion, very easy to use. Here’s the FFT function I’m using in the app:

- (void)computeFFT
{
    // Only works iOS 4 and above 
    for (NSUInteger i = 0; i < MAX_FREQUENCY; i++)
    {
        input.realp[i] = (double)fftWaveform[i];
        input.imagp[i] = 0.0f;
    }
	
    /* 1D in-place complex FFT */
    vDSP_fft_zipD(fft_weights, &input, 1, 6, FFT_FORWARD);  
	
    input.realp[0] = 0.0;
    input.imagp[0] = 0.0;
}
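The sample stops at the raw FFT output; to draw a spectrum you still need a magnitude per bin. One way to get there (not necessarily what the GitHub project does) is vDSP_zvabsD, optionally followed by a decibel conversion. The magnitudes buffer below is a placeholder name; MAX_FREQUENCY matches the loop above.

// Per-bin magnitudes from the in-place FFT result, into a plain double buffer.
double magnitudes[MAX_FREQUENCY];
vDSP_zvabsD(&input, 1, magnitudes, 1, MAX_FREQUENCY);

// Optional: convert to decibels so the spectrum is easier on the eyes.
double reference = 1.0;
vDSP_vdbconD(magnitudes, 1, &reference, magnitudes, 1, MAX_FREQUENCY, 1);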

Before you can use the FFT function you need to create the weights; in my case I’m doing this in viewDidLoad.

fft_weights = vDSP_create_fftsetupD(6, kFFTRadix2); // 6 is the base 2 exponent.
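Since the setup object owns memory allocated by Accelerate, the matching teardown call (in dealloc, for example) would be:

vDSP_destroy_fftsetupD(fft_weights); // release the precomputed weights when you're done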

Well, that’s about it. There’s a lot more in the code sample and I hope you check it out. The last thing I’d like to mention is that this app also draws both the waveform and spectrum of the audio being played. The majority of the code that draws these is from this StackOverflow post, which also covers a lot of what I’ve written here.

Making a PDF from a UIWebView – Updated


For those coming to the site for the Making a PDF from a UIWebView post, I’ve updated the project on GitHub to support pre-iOS 5 targets. The project was originally set up using Storyboards, which are only available in iOS 5. The project now uses xibs and is functionally identical.

New Game Conference 2011


Last week I was fortunate enough to be able to go to the New Game Conference in San Francisco despite being extremely busy at work, and simply put, it was awesome. As expected there were a lot of web developers, but there were more console/PC game developers than I had anticipated. I have to give a huge thanks to +Seth Ladd and +Darius Kazemi for putting together such an amazing event, and to all the presenters; there really wasn’t a single talk I didn’t get something out of. If you’re looking for slides, code, and talk-specific write-ups, check out ConfSwag. While the range of talks was pretty diverse, there were a few common threads:

  • We all love the Web Audio API, but it still needs a bit of work and it definitely needs support in all browsers.
  • Developers still want to make games in Java and C++, then cross-compile to HTML/JS (e.g. PlayN, Mandreel).
  • Beware of the garbage collector; it can cause a number of performance issues.
  • WebGL and modern browsers can provide PS2-quality games, but the web as a game platform might not be ready for prime time.

On the last point, it was mentioned more than once that even though the platform might not be ready, it’s evolving so fast that we should continue to push it and be developing those games now. Richard Hilleman’s keynote touched on this, saying there needs to be that killer game, similar to what Halo was for the Xbox.

Chrome seemed to be the best target for that killer game, with a rich set of developer tools, features, and an install base of over 200 million. However, it was either Paul Bakaus or Grant Skinner who reminded everyone that even though a user’s browser can support features like WebGL, that doesn’t mean their machine can. I ran into this very issue when I attempted to show a coworker some of the demos from the conference, only to receive an error telling me their graphics card was unable to play it. This may seem a little ridiculous, but we have to remember that unlike the console market, we have no idea what device our users are trying to play our games on. A running joke was that for all we know they’re playing their games on a refrigerator.

Overall the quality of the sessions was very high and I hope at some point videos will be released. There were a few talks that really stood out though, and tomorrow I’ll be writing about my top five and some of the generally amazing things shown. It may be because of the intimacy of the conference (around 200-250 people), but I enjoyed it as much as GDC, if not more.