Training

Currently Available Classes

Note: Each training course is currently offered as a separate one-day class. Tickets are valid only for the day(s) you registered for. Registration for these courses can be purchased as part of a conference ticket or à la carte.


How to register for Training classes at IOTAConf 2014:

A drop-down menu lets you select whom you would like to train with. If you are purchasing more than one training ticket, you can keep using the à la carte training option to add training for all three days.

For questions, or if you would like to instruct, contact us at classes@iotaconf.com.


Nuts and Bolts of WebSocket

Instructors: Arun Gupta, Red Hat & Jean Francois


WebSocket provides a rich, full-duplex communication channel over which client and server can talk to each other in a standard way. The JavaScript API is also widely available in Web browsers, making WebSocket easy to adopt on the client.
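
As a taste of the JSR 356 material, here is a minimal sketch of an annotated server endpoint in Java; the /echo path and class name are our own illustration, not from the course:

    import javax.websocket.OnMessage;
    import javax.websocket.Session;
    import javax.websocket.server.ServerEndpoint;

    // Minimal JSR 356 endpoint: the container discovers the annotated
    // class and upgrades matching HTTP requests to WebSocket.
    @ServerEndpoint("/echo")
    public class EchoEndpoint {

        // Called once per incoming text frame; returning a value
        // sends it back to the client over the same connection.
        @OnMessage
        public String onMessage(String message, Session session) {
            return message;
        }
    }

Deployed in a JSR 356 container such as WildFly, a browser can reach this endpoint with new WebSocket("ws://host:8080/app/echo"), where app is the web application's context root.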

Would you like to learn how to build WebSocket applications in Java?
How and where do you deploy these applications?
What are the concerns around firewalls, DNS, and routers?
How do you debug message exchanges?
What do you do if WebSocket is not supported by the browser or app server?
What are the common debugging and production tips?
And what are phantom WebSockets?

This university session is for you! Here is what we'll cover:

  • Introduction to WebSocket
  • WebSocket using JSR 356
  • WebSocket using Undertow/WildFly
  • WebSocket using Atmosphere
  • Surviving Firewalls and Proxies
  • WebSocket Debugging
  • WebSocket Production Tips


Course Level: Beginner

Prerequisites:

This workshop is based upon real-life experience deploying WebSockets.

Objectives:

  • Introduction to WebSocket
  • WebSocket using JSR 356
  • WebSocket using Undertow/WildFly
  • WebSocket using Atmosphere
  • Surviving Firewalls and Proxies
  • WebSocket Debugging
  • WebSocket Production Tips

Register Now


Big Data Bootcamp for Devices

Instructor: Santosh Jha, Aziksa



Hadoop and Big Data Fundamentals

This course begins with the motivation for big data and an explanation of all the components of Hadoop, including the Hadoop cluster and its distributed file system. You will learn how to use HDFS distributed storage, what MapReduce is, and how it enables distributed processing. You will write a word-count program in Java (a sketch follows the module list below) and explore MapReduce further in Python.

  • Module 01 - Big Data: motivation and Hadoop components
  • Module 02 - Using the Hadoop HDFS distributed storage
  • Module 03 - Distributed processing with MapReduce
  • Module 04 - A word-count Java program in MapReduce
  • Module 05 - A better word-count program
  • Module 06 - MapReduce and other languages (a simple example in Python)
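
To give a feel for Modules 03 through 05, here is the canonical Hadoop word-count job in Java. This is the standard textbook example rather than the course's exact code, and the class names are our own:

    import java.io.IOException;
    import java.util.StringTokenizer;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.IntWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.mapreduce.Mapper;
    import org.apache.hadoop.mapreduce.Reducer;
    import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
    import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

    public class WordCount {

      // Mapper: emits (word, 1) for every token in its input split.
      public static class TokenizerMapper
          extends Mapper<Object, Text, Text, IntWritable> {
        private static final IntWritable ONE = new IntWritable(1);
        private final Text word = new Text();

        @Override
        public void map(Object key, Text value, Context context)
            throws IOException, InterruptedException {
          StringTokenizer itr = new StringTokenizer(value.toString());
          while (itr.hasMoreTokens()) {
            word.set(itr.nextToken());
            context.write(word, ONE);
          }
        }
      }

      // Reducer: sums the counts gathered for each word.
      public static class IntSumReducer
          extends Reducer<Text, IntWritable, Text, IntWritable> {
        private final IntWritable result = new IntWritable();

        @Override
        public void reduce(Text key, Iterable<IntWritable> values,
            Context context) throws IOException, InterruptedException {
          int sum = 0;
          for (IntWritable val : values) {
            sum += val.get();
          }
          result.set(sum);
          context.write(key, result);
        }
      }

      public static void main(String[] args) throws Exception {
        Job job = Job.getInstance(new Configuration(), "word count");
        job.setJarByClass(WordCount.class);
        job.setMapperClass(TokenizerMapper.class);
        job.setCombinerClass(IntSumReducer.class); // local pre-aggregation
        job.setReducerClass(IntSumReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        System.exit(job.waitForCompletion(true) ? 0 : 1);
      }
    }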

Hive - A SQL like Programming language for Big Data

Hive allows SQL developers to write Hive Query Language (HQL) statements that are similar to SQL statements; for anyone with a SQL or relational-database background, this section will look very familiar. The course covers basic concepts, joins, partitions, bucketing, external tables, and more (a sketch follows the module list below).

  • Module 01 - Hive: basic concepts
  • Module 02 - Hive: joins
  • Module 03 - Hive: partitions
  • Module 04 - Hive: bucketing and external tables
  • Module 05 - Hive: data pipeline version 1 (a basic data warehouse)
  • Module 06 - Hive: data pipeline upgrade (builds on the previous case)
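
As a hint of what the HQL modules look like, here is a sketch of issuing Hive statements from Java over the HiveServer2 JDBC driver. The host, table, and column names (page_views, users) are illustrative assumptions, and a users table is assumed to already exist:

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;

    public class HiveQuickLook {
      public static void main(String[] args) throws Exception {
        Class.forName("org.apache.hive.jdbc.HiveDriver");
        try (Connection conn = DriverManager.getConnection(
                 "jdbc:hive2://localhost:10000/default", "", "");
             Statement stmt = conn.createStatement()) {

          // An external, partitioned table over files already in HDFS
          // (Modules 03 and 04).
          stmt.execute("CREATE EXTERNAL TABLE IF NOT EXISTS page_views ("
              + " user_id STRING, url STRING)"
              + " PARTITIONED BY (view_date STRING)"
              + " ROW FORMAT DELIMITED FIELDS TERMINATED BY '\\t'"
              + " LOCATION '/data/page_views'");

          // A SQL-like query with a join (Modules 01 and 02); assumes
          // a users table with id and name columns.
          try (ResultSet rs = stmt.executeQuery(
              "SELECT u.name, COUNT(*) AS views"
              + " FROM page_views v JOIN users u ON (v.user_id = u.id)"
              + " GROUP BY u.name")) {
            while (rs.next()) {
              System.out.println(rs.getString(1) + "\t" + rs.getLong(2));
            }
          }
        }
      }
    }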

Pig - A Data Flow Language for Big Data

Pig was initially developed to let people using Hadoop focus more on analyzing large data sets and spend less time writing mapper and reducer programs. The Pig programming language is designed to handle any kind of data. Pig has two components: the first is the language itself, called Pig Latin, and the second is a runtime environment in which Pig Latin programs are executed (a sketch follows the module list below).

  • Module 01 - Pig: basic concepts and comparison with Hive
  • Module 02 - Pig: the programming language
  • Module 03 - Pig: the programming language (continued)
  • Module 04 - Pig: reading data from Hive tables
  • Module 05 - Pig: ad hoc data analytics
  • Module 06 - Pig: re-implementing the data pipeline
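
Pig Latin scripts can also be embedded in Java through the PigServer API. The sketch below is our own word-count illustration running in local mode, not course material; input.txt is a placeholder path:

    import org.apache.pig.ExecType;
    import org.apache.pig.PigServer;

    public class PigWordCount {
      public static void main(String[] args) throws Exception {
        // Local mode: runs against the local file system, no cluster needed.
        PigServer pig = new PigServer(ExecType.LOCAL);

        // Each registerQuery line is one Pig Latin statement.
        pig.registerQuery(
            "lines = LOAD 'input.txt' USING TextLoader() AS (line:chararray);");
        pig.registerQuery("words = FOREACH lines GENERATE"
            + " FLATTEN(TOKENIZE(line)) AS word;");
        pig.registerQuery("grouped = GROUP words BY word;");
        pig.registerQuery("counts = FOREACH grouped GENERATE"
            + " group AS word, COUNT(words) AS n;");

        // Materialize the result as a directory of part files.
        pig.store("counts", "word_counts");
      }
    }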

Hands-On Lab Experience

Students will work through the following lab exercises using the Cloudera CDH4 Platform:


HDFS Command Line

  • Navigate the distributed file system (sample commands follow this list).
  • View partial contents of files in the cluster.
  • Load files into the cluster.
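
For orientation, these are the standard HDFS shell commands behind those exercises; the paths are placeholders, and on CDH4 the equivalent hadoop fs prefix works the same way:

    # List a directory in the distributed file system
    hdfs dfs -ls /user/student

    # View partial contents of a file stored in the cluster
    hdfs dfs -cat /user/student/logs/access.log | head
    hdfs dfs -tail /user/student/logs/access.log

    # Load a local file into the cluster
    hdfs dfs -put access.log /user/student/logs/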

Pig

  • Parse log files into data sets.
  • Write a program to extract fields from complex text and calculate stats.
  • Prepare a program to be scheduled to run daily via cron.

Hive

  • Create and alter Hive tables.
  • View the schema of a Hive table.
  • Load data into a table.

Join tables

  • Create a program to read text, parse it, and load the data.
  • Create a program to be scheduled by cron to load data.

Map/Reduce

  • Write a map/reduce program in Python to calculate stats on a text corpus.
  • Write a map/reduce program to list the top 20 words in a text input.

Course Level: Intermediate

Prerequisites:

Basic computer skills and basic programming knowledge

Objectives:

At the completion of the course, students will be able to do the following:

  • Understand core concepts, Hadoop clusters, and tools
  • Learn best practices for building Hadoop solutions
  • Write MapReduce programs
  • Develop programs using Hive and Pig
  • Design data pipelines


Beyond REST: Building Reliable Distributed Applications


When we were building applications on the desktop, our network demands were relatively simple. But as applications grow in complexity and move to mobile, we need better strategies for dealing with network slowdowns, interruptions, and hostile intermediaries (e.g. proxies).
In this hands-on workshop, we'll look at the basic causes of trouble on the network and how they affect our applications. We'll cover best practices for building fault-tolerant applications across the network, then practice building clients and servers that follow these recommendations.
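
One recovery pattern we'll practice is reconnection with exponential backoff and jitter. The workshop itself works in JavaScript (browser and NodeJS); the sketch below shows the same idea in Java for consistency with the other examples on this page, with run() standing in for any flaky network call:

    import java.util.concurrent.ThreadLocalRandom;

    // Retry a flaky operation with exponential backoff plus jitter, so
    // that reconnecting clients do not all hammer the server in lockstep.
    public class Backoff {

        interface FlakyOp { void run() throws Exception; }

        static void retryWithBackoff(FlakyOp op, int maxAttempts)
                throws Exception {
            long delayMs = 250; // initial backoff
            for (int attempt = 1; ; attempt++) {
                try {
                    op.run();
                    return; // success, stop retrying
                } catch (Exception e) {
                    if (attempt >= maxAttempts) throw e; // give up
                    // Sleep for the backoff plus random jitter, then double it.
                    long jitter = ThreadLocalRandom.current().nextLong(delayMs);
                    Thread.sleep(delayMs + jitter);
                    delayMs = Math.min(delayMs * 2, 30_000); // cap at 30 s
                }
            }
        }
    }

Capping the delay keeps worst-case reconnect latency bounded, while the jitter spreads retries from many clients over time instead of producing synchronized thundering herds.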


Course Level: Advanced

Prerequisites:

You should be comfortable writing complete JavaScript applications on both the client (browser) and the server (NodeJS); this calls for intermediate to advanced JavaScript skills. You will need a machine where you can develop both, with Node installed and running.
You should also be familiar with managing Linux systems from the command line: logging in over SSH with a local private key, uploading files via ftp/sftp, editing files (vi), navigating directories, and running scripts.

Objectives:

  • Identify the factors that could affect your application's connectivity
  • Decide which communication strategy makes the most sense for your application (e.g. request-response vs. asynchronous messaging vs. streaming)
  • Plan and implement a recovery policy for your application
  • Explore building reliability on top of HTTP and WebSocket
  • Experiment with applications under conditions of degraded reliability