This tutorial started as a test job for an unreliable customer that came to nothing. As I was not paid for my time and development effort, I have decided to release it as a tutorial, so it can at least benefit the developer community. Note for developers: never, ever, ever accept implementing a “test app” for a customer without asking for fair payment for your time, no matter how alluring the offer may seem.

Many times, developers are faced with the challenge of implementing an App that retrieves some kind of information from a 3rd party RESTful API and then shows this information in a way that allows the user to interact with the data, possibly resulting in further requests to the RESTful API. One of the many ways this interaction can happen is via an MKMapView, which makes for an interesting situation in which the user’s gestures and touches on the map provoke an on-the-fly change in the information. This is our scenario.

The App, called “Nearby Flickr Photos”, will retrieve some photos from Flickr based on the location of the user and present them in an MKMapView as annotations centered on the visible region. The user will then move through the map, zooming in and out, and swiping to other locations, resulting in the map annotations being updated for the current location. It will work both for iOS 7 and iOS 8.

As always, you can download the code from my Github repository and use it for your personal or commercial projects.

Initial setup

We’ll start by creating a single view application project in Xcode 6. The application will be called “NearbyFlickrPhotos”, and will target iOS 7.1 and up for iPhone only. First, we will set up the view controllers for our App in our storyboard. We will need two view controllers: the main one for showing the Flickr photos in a map, and a second one for displaying a concrete photo in detail.

In our main view controller, we will add an MKMapView and set its layout to full screen (setting top, bottom, leading and trailing space to the container view to zero in the constraints editor), and we will set our VC as the delegate of the map. That’s all we need for the main VC.

Then, we will insert our second view controller. We will add a UIImageView, also taking all the space in the container view, and we will also insert a bottom view containing all the information and some action buttons. As we are not targeting iOS 8 exclusively, we cannot add a UIVisualEffectView with a blur effect (as it is only available starting with iOS 8), so we will just set the view color to white and its opacity to something like 90-95%. We will add two buttons at the right, one for dismissing the VC and another one for sharing the photo (in case we like one). At the left, we will insert two labels. The top one will show the title of the photo, and the bottom one will show the coordinates. We will set our layout constraints so that the container view has a fixed height of 90 points and is pinned at the bottom, the button icons inside stay at its right with fixed sizes, and the labels stay at the left, being allowed to grow and shrink in width.

After the initial setup, we will take care of the communication with the RESTful Flickr API.

Communicating with the Flickr backend

In order to communicate with the Flickr REST API, we will implement a class called RESTManager, following the singleton pattern. This unique class will be in charge of retrieving the photos based on a given location and loading the media information asynchronously. For communicating with RESTful interfaces, I usually employ AFNetworking, because it is easy to use and allows you to set up a RESTful client quickly. In order to import it, you must install CocoaPods and set up a Podfile. As CocoaPods is a really handy and powerful tool for managing and deploying Cocoa modules and 3rd party libraries, I suggest you spend some time learning its basics, but if you just want to follow this tutorial, you can quickly install and set up everything in a few steps:

  • Close the project and install CocoaPods: First, you need to close the project, because installing AFNetworking as a CocoaPod will create a .xcworkspace environment that we will use to open our project. In other words, once we have installed the pods, we won’t open our project using the .xcodeproj file.
sudo gem install cocoapods
  • Create a Podfile: open your favorite editor (mine is vim) and create a new file in the root directory of the project called “Podfile”. Enter this content for the file, save and exit:
platform :ios, '7.0'
pod "AFNetworking", "~> 2.0"
  • Install the modules specified in the Podfile: open the terminal (⌘-Space, type terminal, press Enter), navigate to the project directory and run the pod install command:
pod install
  • Open the workspace and have fun: if everything goes well, the pod will install, generating a .xcworkspace file that we will open from now on to work on our project. Now just double-click on it and let’s keep on working.


When you open the new .xcworkspace file, you will see two projects: our NearbyFlickrPhotos project, and the Pods project containing AFNetworking. Now we can start implementing our RESTManager. We will start by defining the singleton instance method:

+ (RESTManager *) sharedInstance {
   static dispatch_once_t pred = 0;
   __strong static RESTManager * _sharedObject = nil;
   dispatch_once(&pred, ^{
       _sharedObject = [[self alloc] init];
   });
   return _sharedObject;
}

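As an aside, the same once-only initialization idea can be sketched outside Objective-C. Here is a minimal Python analogue (the class name RESTManager just mirrors ours; a double-checked lock plays the role of dispatch_once):

```python
import threading

class RESTManager:
    """Minimal thread-safe singleton, analogous to the dispatch_once pattern."""
    _instance = None
    _lock = threading.Lock()

    @classmethod
    def shared_instance(cls):
        if cls._instance is None:        # fast path: already created
            with cls._lock:              # slow path: create exactly once
                if cls._instance is None:
                    cls._instance = cls()
        return cls._instance
```

Every call to shared_instance() returns the same object, just as sharedInstance does above.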
Now let’s have a look at the Flickr REST API. Flickr has a single request URL for REST requests, and the client specifies the requested operation by a “method” parameter. The URL is “”. We need a method that will retrieve the photo information for given location coordinates (latitude and longitude), and that method is “”. As with most 3rd party RESTful backends, you need to open a developer account with Flickr and create an App in order to be given an identifier to be used when communicating with the backend. In the case of Flickr, you must head to “”, create an App there, and grab the ApiKey value that will be used as a parameter in our requests.

We will define a constant for this ApiKey called “kNearbyFlickrPhotosFlickrAPIKey”. You must set this constant to the string of your ApiKey. We will also define some other constants that are needed for our request, and a method that will put together our final request URL based on a region in a map:

#define kNearbyFlickrPhotosFlickrAPIKey @"INSERT_YOUR_FLICKR_APP_ID_HERE"
#define kNearbyFlickrPhotosFlickrBaseRESTURL @""
#define kNearbyFlickrPhotosFlickrSearchMethod @""
#define kNearbyFlickrPhotosFlickrJSONFormat @"format=json"
#define kNearbyFlickrPhotosFlickrApiKeyParameter @"api_key"
#define kNearbyFlickrPhotosFlickrBoundingBoxParameter @"bbox"
#define kNearbyFlickrPhotosFlickrExtras @"extras=geo,url_t,url_o,url_m" 
#define kNearbyFlickrPhotosFlickrNoJSONCallback @"nojsoncallback=1"
#define kNearbyFlickrPhotosFlickrPerPageParameter @"per_page="
#define kNearbyFlickrPhotosMaxPhotosToRetrieve 100 

- (NSString *) buildFlickrSearchURLFromLocation: (CLLocationCoordinate2D) bottomLeft toLocation: (CLLocationCoordinate2D) topRight {
   return [NSString stringWithFormat:@"%@%@&%@&%@=%@&%@=%f,%f,%f,%f&%@&%@%d&%@",
       kNearbyFlickrPhotosFlickrBaseRESTURL, kNearbyFlickrPhotosFlickrSearchMethod,
       kNearbyFlickrPhotosFlickrJSONFormat,
       kNearbyFlickrPhotosFlickrApiKeyParameter, kNearbyFlickrPhotosFlickrAPIKey,
       kNearbyFlickrPhotosFlickrBoundingBoxParameter, bottomLeft.longitude, bottomLeft.latitude, topRight.longitude, topRight.latitude,
       kNearbyFlickrPhotosFlickrExtras,
       kNearbyFlickrPhotosFlickrPerPageParameter, kNearbyFlickrPhotosMaxPhotosToRetrieve,
       kNearbyFlickrPhotosFlickrNoJSONCallback];
}

Let’s analyze all these parameters that form the final request URL:

  • kNearbyFlickrPhotosFlickrJSONFormat specifies that we want the results expressed in JSON format.
  • kNearbyFlickrPhotosFlickrApiKeyParameter is the parameter we will use to specify our ApiKey.
  • kNearbyFlickrPhotosFlickrBoundingBoxParameter will indicate a search area based on the coordinates of two points: the top right corner and the bottom left corner of a square or rectangle. In our case, it will be the visible area of our MKMapView.
  • kNearbyFlickrPhotosFlickrExtras specifies that we want the geolocation of the photos (geo) and that we want the URLs of the thumbnail, original (big) and medium photo sizes (url_t, url_o and url_m).
  • kNearbyFlickrPhotosFlickrNoJSONCallback is important to ask Flickr not to wrap its response in a non-standard format. If not specified, Flickr will add a wrapping envelope that will ruin any attempt of JSON decoding of the response.
  • kNearbyFlickrPhotosFlickrPerPageParameter specifies the maximum number of items that the answer must include. The default maximum for Flickr is 250, but we will define a constant called kNearbyFlickrPhotosMaxPhotosToRetrieve set to 100 photos.

The buildFlickrSearchURLFromLocation:toLocation: method will simply put together all these elements to generate the URL for the GET request to the Flickr backend.
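To make the URL assembly concrete, here is a language-neutral sketch in Python of what buildFlickrSearchURLFromLocation:toLocation: produces. The base URL, method name and API key below are placeholders, not Flickr’s real values:

```python
from urllib.parse import urlencode

# Placeholder values: substitute the real base URL, search method and ApiKey.
BASE_URL = "https://example.invalid/rest/"
API_KEY = "INSERT_YOUR_FLICKR_APP_ID_HERE"

def build_flickr_search_url(bottom_left, top_right, per_page=100):
    """bottom_left/top_right are (latitude, longitude) pairs; the bbox
    parameter is ordered minLon,minLat,maxLon,maxLat as in the tutorial."""
    params = {
        "method": "SEARCH_METHOD_PLACEHOLDER",
        "format": "json",
        "nojsoncallback": 1,
        "api_key": API_KEY,
        "bbox": "%f,%f,%f,%f" % (bottom_left[1], bottom_left[0],
                                 top_right[1], top_right[0]),
        "extras": "geo,url_t,url_o,url_m",
        "per_page": per_page,
    }
    return BASE_URL + "?" + urlencode(params)
```

Note that urlencode also takes care of percent-escaping the commas in bbox and extras, which stringWithFormat: leaves to the URL-loading layer.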

Now we are capable of defining our loadFlickrImagesFromLocation:toLocation:andExecuteBlock: method to request the photos of a region area of our map:

- (void) loadFlickrImagesFromLocation: (CLLocationCoordinate2D) bottomLeft toLocation: (CLLocationCoordinate2D) topRight andExecuteBlock: ( void (^) (BOOL success, NSArray * entries) ) block {
   // build the request, baseURL and parameters
   NSString * baseURL = [self buildFlickrSearchURLFromLocation:bottomLeft toLocation:topRight];
   AFHTTPRequestOperationManager * manager = [[AFHTTPRequestOperationManager alloc] init];
   // execute the request
   [manager GET:baseURL parameters:nil success:^(AFHTTPRequestOperation *operation, id responseObject) {
      // analyze results
      if (responseObject && [responseObject isKindOfClass:[NSDictionary class]]) { // analyze response 
         NSDictionary * responseDict = (NSDictionary *) responseObject;
         NSDictionary * photosDict = responseDict[kNearbyFlickrPhotosResponseParamPhotos];
         if (photosDict && ([photosDict isKindOfClass:[NSDictionary class]])) {
            NSArray * photoArray = photosDict[kNearbyFlickrPhotosResponseParamPhoto];
            if (photoArray && ([photoArray isKindOfClass:[NSArray class]])) {
               block(YES, photoArray);
            } else block(NO, nil);
         } else block(NO, nil);
      } else block(NO, nil);
   } failure:^(AFHTTPRequestOperation *operation, NSError *error) { // invalid request.
      block(NO, nil);
   }];
}

Flickr will return the array of photos inside the element “photos”, as an array called “photo”, as shown below (some other fields hidden for brevity):

photos = {
 page = 1;
 pages = 450725;
 perpage = 100;
 photo = (
    {
     "height_m" = 500;
     "height_o" = 2988;
     "height_t" = 100;
     id = 15392959305;
     latitude = "36.996166";
     longitude = "-121.376359";
     title = "Hollister Renisanse Fair";
     "url_m" = "";
     "url_o" = "";
     "url_t" = "";
     "width_m" = 281;
     "width_o" = 5312;
     "width_t" = 56;
    },
    ... // other photo entries
 );
}

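The defensive unwrapping done in the completion block earlier (checking “photos”, then “photo”) can be mirrored in a short Python sketch over an already-decoded response:

```python
def extract_photo_entries(response):
    """Return the list found under photos -> photo, or None if the
    response does not have the expected shape."""
    if not isinstance(response, dict):
        return None
    photos = response.get("photos")
    if not isinstance(photos, dict):
        return None
    photo_list = photos.get("photo")
    if not isinstance(photo_list, list):
        return None
    return photo_list
```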
If we receive a successful answer and find the structure as we expect it, we return the array of photos by calling the completion block. We also need to define a method that will load the image data from the photo URLs. We will call this method loadRemoteImageFromURL:andExecuteBlock:.

- (void) loadRemoteImageFromURL: (NSURL *) url andExecuteBlock: (void (^)(BOOL success, UIImage * image, NSURL * url)) block {
   NSURLRequest * urlRequest = [NSURLRequest requestWithURL:url];
   AFHTTPRequestOperation *requestOperation = [[AFHTTPRequestOperation alloc] initWithRequest:urlRequest];
   requestOperation.responseSerializer = [AFImageResponseSerializer serializer];
   [requestOperation setCompletionBlockWithSuccess:^(AFHTTPRequestOperation *operation, id responseObject) {
      block(YES, responseObject, url);
   } failure:^(AFHTTPRequestOperation *operation, NSError *error) { block(NO, nil, url); }];
   [requestOperation start];
}

That’s all we need for being able to communicate with the Flickr REST API. Now let’s dive into the main UI element: our map.

Building our interactive map

The main interaction unit of our UI is the map of our main UIViewController. We set it up in IB, and made our VC its delegate, so let’s implement its functionality. Apart from our mapView property, we need a set of annotations (pins) that will be shown in the map, one for each photo entry, so we define an NSArray called mapAnnotations. We will also manage the user’s current location (just to show the initial location and nearby photos) and a CLLocationManager. As we are only interested in the user’s initial location, and will not track his/her movements, we will define a boolean property called firstLocationHasBeenRetrieved, which we will use to determine whether we still need to retrieve the user location and initialize the map, or we should stop querying the device’s location because we already have it.


Prior to iOS 8, an MKMapView alone could retrieve and use the location of the user by means of the instance property showsUserLocation. Starting with iOS 8, we need to explicitly request the user’s authorization for accessing the device’s location. In order to do this, we need to define a CLLocationManager instance and call requestWhenInUseAuthorization (if our App will retrieve the location when it’s in the foreground) or requestAlwaysAuthorization (if the App is going to get the location even when in the background).

We will also need to specify, in our NearbyFlickrPhotos-Info.plist file, the property “NSLocationWhenInUseUsageDescription” or “NSLocationAlwaysUsageDescription”, containing the message that will be shown to the user requesting authorization for use in foreground or always (foreground+background) respectively. In our case, we will call requestWhenInUseAuthorization, as we don’t really need the location of the user if we go to the background. We will indicate a request message like “NearbyFlickrPhotos needs your location for being able to retrieve the closest photos from Flickr.”, and we will make sure we only request it for iOS 8 (as iOS 7 doesn’t implement the method):

if (!firstLocationHasBeenRetrieved) {
#ifdef __IPHONE_8_0
   if ([self.locationManager respondsToSelector:@selector(requestWhenInUseAuthorization)])
      [self.locationManager requestWhenInUseAuthorization];
#endif
   self.mapView.showsUserLocation = NO;
}

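For reference, the matching entry in NearbyFlickrPhotos-Info.plist (shown here in the plist’s XML source) pairs the standard Core Location usage-description key with the message suggested above; use NSLocationAlwaysUsageDescription instead if you call requestAlwaysAuthorization:

```xml
<key>NSLocationWhenInUseUsageDescription</key>
<string>NearbyFlickrPhotos needs your location for being able to retrieve the closest photos from Flickr.</string>
```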
Notice how we use a preprocessor #if condition and a respondsToSelector: check to isolate the authorization request to iOS 8 only. We explicitly set showsUserLocation to NO because we are going to set the user’s location in the map by means of our CLLocationManager. When we initialize our CLLocationManager and request authorization, it will call its delegate method locationManager:didChangeAuthorizationStatus: each time the authorization status changes (i.e., if the user authorizes the use of his/her location). Thus, we will implement CLLocationManagerDelegate in our view controller and define that method as follows:

- (void) locationManager:(CLLocationManager *)manager didChangeAuthorizationStatus:(CLAuthorizationStatus)status {
   [self closeLoadingAlert];
   if (status == kCLAuthorizationStatusAuthorized || status == kCLAuthorizationStatusAuthorizedWhenInUse) { 
      // we got authorized.
      [self.locationManager startUpdatingLocation];
      self.mapView.showsUserLocation = YES;
   } else if (status == kCLAuthorizationStatusRestricted) { // Unable to access location
      [self showAlertWithMessage:@"Unable to retrieve your location..." isError:YES];
   } else if (status == kCLAuthorizationStatusDenied) { // Location not authorized by the user.
      [self showAlertWithMessage:@"You must authorize access to your location to NearbyFlickrPhotos..." isError:YES];
   }
}

We will start updating our location only if we get authorized by the user. We call our locationManager’s startUpdatingLocation and also ask our map to show the user’s location, thus ensuring that we will retrieve the device’s location one way or another. The first case is handled by the CLLocationManagerDelegate method locationManager:didUpdateLocations:, while the second one is handled by the MKMapViewDelegate method mapView:didUpdateUserLocation:. Both exhibit the same behavior: they check whether the location of the user has already been retrieved and, if not, they store this initial location and stop further location retrieval (both from the map and from the CLLocationManager), calling the method updateFlickrImagesInMap to start retrieving the photos for the current location:

- (void)mapView:(MKMapView *)mapView didUpdateUserLocation:(MKUserLocation *)userLocation {
   self.userLocation = userLocation.coordinate;
   [self.mapView setCenterCoordinate:userLocation.coordinate animated:YES];
   if (!firstLocationHasBeenRetrieved) {
      firstLocationHasBeenRetrieved = YES;
      self.mapView.showsUserLocation = NO;
      self.mapView.userTrackingMode = MKUserTrackingModeNone;

      // calculate first nearby photos
      [self updateFlickrImagesInMap];
   }
}

Thus, when we call updateFlickrImagesInMap, our mapView has been initialized with a proper location, and will adjust to a certain region. We will extract this region’s coordinates and call our RESTManager method loadFlickrImagesFromLocation:toLocation:andExecuteBlock: in the method updateFlickrImagesInMap.

- (void) updateFlickrImagesInMap {
   CLLocationCoordinate2D bottomLeft = [self getBottomLeftCornerOfMap];
   CLLocationCoordinate2D topRight = [self getTopRightCornerOfMap];
   [[RESTManager sharedInstance] loadFlickrImagesFromLocation:bottomLeft toLocation:topRight 
      andExecuteBlock:^(BOOL success, NSArray *entries) {
      if (success) {
         [self generateMapAnnotationsForEntries:entries];
      } else self.mapAnnotations = @[];
   }];
}

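The two corner helpers used above (getBottomLeftCornerOfMap and getTopRightCornerOfMap) are not shown in full, but they boil down to simple arithmetic on the visible region’s center and span. Here is a hypothetical Python sketch of that math:

```python
def region_corners(center_lat, center_lon, lat_delta, lon_delta):
    """Bottom-left and top-right corners of a visible map region,
    computed as the center plus/minus half the span (the same idea an
    MKCoordinateRegion-based helper would use)."""
    bottom_left = (center_lat - lat_delta / 2.0, center_lon - lon_delta / 2.0)
    top_right = (center_lat + lat_delta / 2.0, center_lon + lon_delta / 2.0)
    return bottom_left, top_right
```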
If the request is successful, we will call generateMapAnnotationsForEntries:, which will generate the map annotations for the retrieved entries and set them as the map annotations. The MKMapView manages an array of annotations, which are shown as the pins in the map. In order to display the pins over the map, you need to create an array of such annotations and assign them to the map by calling addAnnotations: and passing the array. These annotation objects must conform to the MKAnnotation protocol.

In order to put a set of annotations on a map, it is good practice to create a class conforming to MKAnnotation. This class will contain properties for all the relevant information that needs to be displayed. In our case, we will define a class called FlickrPhotoAnnotation that conforms to MKAnnotation and contains properties for all the information from a Flickr photo entity, including the photo name, thumbnail and big images, and the coordinates:

@property (nonatomic, readonly) CLLocationCoordinate2D coordinate; // photo location (required by MKAnnotation)
@property (nonatomic, strong) UIImage * cachedBigImage; // cached image, original (big) version
@property (nonatomic, strong) UIImage * cachedThumbnailImage; // cached image, thumbnail
@property (nonatomic, strong) NSString * imageTitle; // The name (title) of the Flickr Photo (if any)
@property (nonatomic, strong) NSString * bigImageURL; // original (big) image URL
@property (nonatomic, strong) NSString * thumbnailImageURL; // thumbnail image URL

Every instantiation of a FlickrPhotoAnnotation through the initWithValuesFromDictionary: method will receive an NSDictionary containing a “photo” entry from the server response (see above), and will set up all the properties, calling the RESTManager method loadRemoteImageFromURL:andExecuteBlock: to retrieve the thumbnail image that we’ll show in the map pin annotation view. We will not retrieve the original (big) image unless the user selects the photo.

- (id) initWithValuesFromDictionary:(NSDictionary *)dictionary {
   self = [super init];
   if (self) { // fill Flickr Photo Annotation parameters from REST API response
      // Thumbnail image
      if (dictionary[kNearbyFlickrPhotosDictionaryParameterThumbnailURL]) {
         self.thumbnailImageURL = dictionary[kNearbyFlickrPhotosDictionaryParameterThumbnailURL];
         [[RESTManager sharedInstance] loadRemoteImageFromURL:[NSURL URLWithString:self.thumbnailImageURL] 
            andExecuteBlock:^(BOOL success, UIImage *image, NSURL *url) {
               if (success) { self.cachedThumbnailImage = image; }
            }];
      }
      if (!self.cachedThumbnailImage) self.cachedThumbnailImage = [UIImage imageNamed:@"unknownImage"];
      // Big image
      self.bigImageURL = dictionary[kNearbyFlickrPhotosDictionaryParameterBigURL];
      // Location
      _coordinate = CLLocationCoordinate2DMake([dictionary[kNearbyFlickrPhotosDictionaryParameterLatitude] floatValue], 
                    [dictionary[kNearbyFlickrPhotosDictionaryParameterLongitude] floatValue]);

      // Title
      if (dictionary[kNearbyFlickrPhotosDictionaryParameterTitle]) {
         self.imageTitle = dictionary[kNearbyFlickrPhotosDictionaryParameterTitle];
      } else self.imageTitle = @"Unknown Photo";
   }
   return self;
}

Now that we have our annotations ready, we can define the generateMapAnnotationsForEntries: method.

- (void) generateMapAnnotationsForEntries: (NSArray *) entries {
   NSMutableArray * newMapAnnotations = [NSMutableArray arrayWithCapacity:entries.count];
   for (NSDictionary * entry in entries) {
      FlickrPhotoAnnotation * fpa = [[FlickrPhotoAnnotation alloc] initWithValuesFromDictionary: entry];
      if (fpa) [newMapAnnotations addObject:fpa];
   }
   self.mapAnnotations = [newMapAnnotations copy];
   [self.mapView setNeedsDisplay];
}

We define the setter method for mapAnnotations to remove all previous annotations (with removeAnnotations:) before adding the new ones. Annotations on a map are displayed thanks to the MKMapViewDelegate method mapView:viewForAnnotation:. In this method, we will return the view that will be shown when the user touches a pin on the map. This view is an instance of MKPinAnnotationView. If we want to add a button that the user can tap to interact with the photo entry (as we do), we need to set a special property of this MKPinAnnotationView called leftCalloutAccessoryView, which defines the view at the left of the annotation view. We can set this view to a UIButton that will trigger the transition that shows the photo:

- (MKAnnotationView *)mapView:(MKMapView *)mapView viewForAnnotation:(id <MKAnnotation>)annotation {
   // return user's location default blue dot if annotation is user's location
   if (annotation == mapView.userLocation) return nil;
   // else show flickr photo annotation
   MKPinAnnotationView * mkav = (MKPinAnnotationView *) [mapView dequeueReusableAnnotationViewWithIdentifier:@"FlickrPhotoPin"];
   if (!mkav) {
      mkav = [[MKPinAnnotationView alloc] initWithAnnotation:annotation reuseIdentifier:@"FlickrPhotoPin"];
      mkav.canShowCallout = YES;
      mkav.enabled = YES;
      mkav.leftCalloutAccessoryView = [[UIButton alloc] initWithFrame:CGRectMake(0, 0, 30, 30)];
      UIButton * entryButton = (UIButton *) mkav.leftCalloutAccessoryView;
      [entryButton.imageView setContentMode:UIViewContentModeScaleAspectFit];
   }
   mkav.annotation = annotation;
   return mkav;
}

We must take into account the special case of the annotation for the user location, which happens when the requested annotation is our mapView.userLocation. In this case, we return nil so the system will create the default view, which is the blue pulsing dot. In any other case, we will create a new MKPinAnnotationView with the photo title, and define a UIButton as the leftCalloutAccessoryView (we could have used the right one, but I think the image at the left looks nicer) with the entry photo. Each of these MKPinAnnotationViews will be associated with one of our annotations containing the data of the Flickr photo entry.


Additionally, we will define the MKMapViewDelegate methods mapView:didSelectAnnotationView: and mapView:didDeselectAnnotationView:. The first one will load the cached thumbnail image retrieved from Flickr so it can be shown when mapView:viewForAnnotation: gets called. The second one clears this UIButton image to save memory.

Now, when the user taps this button (the image), the MKMapViewDelegate will receive mapView:annotationView:calloutAccessoryControlTapped:. In this method, we will segue to our PhotoDetailViewController, where we will show a full screen version of the photo to the user. As we only retrieved the thumbnail pic for every photo when we created the annotation, we will need to first retrieve the original, big sized image. If we succeed, we will set our selectedFlickrPhoto property with this annotation and perform the segue:

- (void)mapView:(MKMapView *)mapView annotationView:(MKAnnotationView *)view calloutAccessoryControlTapped:(UIControl *)control {
   NSString * originalPhotoURL = nil;
   if ([view.annotation isKindOfClass:[FlickrPhotoAnnotation class]]) {
      originalPhotoURL = [(FlickrPhotoAnnotation *) view.annotation bigImageURL];
   }
   if (originalPhotoURL) { // If we do have a photo, try to download and segue to show it.
      [self showLoadingAlert];
      [[RESTManager sharedInstance] loadRemoteImageFromURL:[NSURL URLWithString:originalPhotoURL] 
                                    andExecuteBlock:^(BOOL success, UIImage *image, NSURL *url) {
         dispatch_async(dispatch_get_main_queue(), ^{ // update UX/UI only in main thread
            if (success) {
               FlickrPhotoAnnotation * ann = (FlickrPhotoAnnotation *) view.annotation;
               ann.cachedBigImage = image;
               self.selectedFlickrPhoto = ann;
               [self closeLoadingAlert];
               [self performSegueWithIdentifier:kNearbyFlickrShowPhotoInDetailSegue sender:nil];
            } else {
               [self closeLoadingAlert];
               [self showAlertWithMessage:@"Error loading image from Flickr" isError:YES];
            }
         });
      }];
   } else [self showAlertWithMessage:@"Unable to load image from Flickr" isError:YES];
}

Now, in the prepareForSegue:sender: method, we will set our destination view controller’s photo information:

- (void) prepareForSegue:(UIStoryboardSegue *)segue sender:(id)sender {
   if ([segue.identifier isEqualToString:kNearbyFlickrShowPhotoInDetailSegue]) {
      PhotoDetailViewController * pdvc = (PhotoDetailViewController *) segue.destinationViewController;
      pdvc.imageToShowInDetail = self.selectedFlickrPhoto.cachedBigImage ? self.selectedFlickrPhoto.cachedBigImage 
                                 : self.selectedFlickrPhoto.cachedThumbnailImage;
      pdvc.photoTitle = self.selectedFlickrPhoto.title;
      pdvc.photoCoordinate = self.selectedFlickrPhoto.coordinate;
   }
}

Last, but not least, we need to react to the user navigating through the map. We will do this thanks to the MKMapViewDelegate method mapView:regionDidChangeAnimated:, that gets called when the map region changes, such as when the user swipes through it, zooms in or out, or moves to a different location. We will simply call our updateFlickrImagesInMap method.

- (void)mapView:(MKMapView *)mapView regionDidChangeAnimated:(BOOL)animated {
   [self updateFlickrImagesInMap];
}

Showing the photo in detail

Our second view controller, PhotoDetailViewController, will receive the photo’s (original) image, title and coordinate, and just update the outlets. The goBack button will simply dismiss the view controller, and its implementation is pretty straightforward. The only interesting bit here is the share button, which will use a UIActivityViewController to share the photo to social networks.

- (IBAction)sharePhoto:(id)sender {
   if (!self.imageToShowInDetail) return;
   // share photo
   NSString *textToShare = ...;
   UIImage *imageToShare = self.imageToShowInDetail;
   NSURL * urlToShare = ...;
   NSArray *itemsToShare = @[textToShare, imageToShare, urlToShare];
   UIActivityViewController *activityVC = [[UIActivityViewController alloc] initWithActivityItems:itemsToShare applicationActivities:nil];
   activityVC.excludedActivityTypes = @[UIActivityTypePrint, UIActivityTypeCopyToPasteboard, UIActivityTypeAssignToContact, UIActivityTypeAddToReadingList, UIActivityTypeSaveToCameraRoll];
   [activityVC setValue: @"I would like to share a Flickr pic with you!" forKey:@"subject"];
   activityVC.completionHandler = ^(NSString *activityType, BOOL completed) {
      if (completed) [self showAlertWithMessage:[NSString stringWithFormat:@"Flickr photo %@ shared", self.photoTitle] isError:NO];
      else [self showAlertWithMessage:[NSString stringWithFormat:@"Flickr Photo %@ was not shared", self.photoTitle] isError:YES];
   };
   [self presentViewController:activityVC animated:YES completion:nil];
}

And that’s all. You can download the full project source code from my Github repository. If you have any questions or comments, please let me know.