Tuesday, November 29, 2011
Friday, November 04, 2011
This is the original video that triggered this conversation.
The details can be found here: http://www.wisetechglobal.com/events.html
Hope to see you there.
In this session Dr. Neil Roodyn (http://www.roodyn.com) will discuss how scenarios that, a few years ago, would have been science fiction are now becoming achievable with the technology we use every day. Taking some small leaps and pulling together technology from different platforms enables us and our clients to achieve tasks that would have seemed magical a mere 5 years ago.
In this session you will learn how some key technologies are likely to become the core components of experiences that seamlessly connect devices, people and software together.
Saturday, July 30, 2011
There are two important parts of that last statement:
1. lightweight methods
2. software shipped
The concept of doing the simplest thing to achieve a goal, is an important part of eXtreme Programming. Traveling light (or not carrying baggage from previous experiences) is an important part of being agile. One of the aspects of agile that I always found attractive was the objective of enabling the development team to deal with change in short time frames.
While it may not be fully apparent from the behaviour of many software development teams, the objective of developing software is to ship a product, a finished piece of working software. This desire to find better ways to get software shipped is clearly not something everyone in the industry shares. That is why we have collections of rules that appear to do nothing more than keep people in jobs. Don't get me wrong, I appreciate that in certain situations we need 5 people analyzing the business rules in order to determine how to build software that manages a complex business process. The reality I often see is that the business process is not that complex, and the reason the team has 5 business analysts is that the career path for developers in that company is to be promoted to a business analyst role.
The reason for this post is to try and make you think about how the actions you are taking are really making you more agile (small 'a'). If your objective is to be able to handle change with as little pain as possible then following a hard set of inflexible rules is not going to help you too much.
Shipping software is an art that is not easy to teach by laying down a set of rules; in many ways it is much more like a creative activity than an engineering activity. Decisions have to be made that are not pure engineering decisions; they are not pure business decisions, nor are they purely design oriented. It is a combination of all of these things and more.
Questions that need to be answered include: Is the software aesthetically pleasing to the user? Is the software functional? Is the time right to release this to the market?
There seems to be a constant flow of software with rules designed to help people become Agile, and there are more and more people professing to have the ultimate rule book for Agile software development.
In my opinion this is 90% bogus. The number of people who have a career telling other people how to ship software, yet have not actually shipped software in years (or ever!), amazes me.
If you want to be truly agile then drop as many tools as you can, learn to do more with fewer rules and restrictions, and most of all practice shipping software by actually shipping software!
Friday, July 15, 2011
In this episode of CodeCast, Ken Levy interviews Dr. Neil Roodyn, discussing the differences and scenarios around cloud and client computing. Topics include applications running in the cloud, documents hosted in the cloud, and rich/smart clients that use the cloud as part of the solution, with more and more of a mixture of client and cloud within applications rather than one versus the other.
Saturday, April 09, 2011
In the last month while in Sydney I have been to the Hotel, Hospitality and Design expo and the EduCause expo, both in Darling Harbour at the Sydney Exhibition and Convention Centre.
The nsquared team in Sydney has been working on optimizing the nsquared education pack for Windows 7 touch. This has been done on a range of devices; HP Touchsmart desktops, Tegatech TEGAV2 and the 3M 22 Inch touch screen.
The Managed Chatrooms website and service has been revamped and updated. Microsoft is among the customers of Managed Chatrooms, and our customers have started using Managed Chatrooms for more events this month than ever before.
You may have noticed that Apple released a new product in March, and ntask, the iPad app I wrote to help manage tasks on Exchange and SharePoint, has had an update to work better with Exchange 2003. Also on the iPad, nsquared released coin swap, a game to teach children the value of money.
Monday, March 28, 2011
Telstra bought Microsoft Surface units to run the nsquared business pack and some custom software that the nsquared team built specifically to engage the Telstra small business customers.
Wednesday, March 23, 2011
I have recently been involved in a number of conversations regarding breadth marketing; essentially this means going for more customers across a wider range of vertical markets. Depth marketing, on the other hand, means targeting a small number of (often already known) customers and spending more time making sure they are incredibly happy with your product.
Given a limited amount of resources (every company in the world has its limits, just some are bigger limits than others) the breadth approach means a lower level of engagement per customer than a depth approach.
The question in my mind is how the approach taken to marketing impacts the perceived value of the brand, and whether making the right choice of approach, combined with a price point, can help a brand become more successful.
The question about perceived value seems like it might be a no-brainer. Yet it might be better asked this way: in order to be successful, does the marketing approach affect the perceived value, or does the product value determine the marketing approach?
Surely an approach that encourages a deeper level of customer engagement will increase the perceived value. Take, for example, the custom-built super car industry. Here the customer is cherished and their hand is held through each step of making choices about the car, including fittings and extras. This is clearly a high-value product and nearly always perceived as a high-value brand; you don’t go to Citroen to get a custom super car built, you go to Ferrari or Lamborghini. Yet we could flip this on its head and say that because this is such a high-value product, it dictates that the customers be given better service. Looked at this way, it would suggest that the higher the value of the product, the deeper you would want your customer engagement. I also wonder if deciding to position your product for a breadth marketing campaign actually lowers its perceived value. The more people that use your product, the less exclusive they feel.
Some products are clearly well positioned for breadth marketing; examples include soft drinks such as Coke or Pepsi, consumer electronics such as iPods or televisions, and mainstream movies such as Spiderman or Harry Potter. These are all low-value products with mass-market appeal, or so their producers would like us to believe.
Let's focus on technology for a moment and consider the entire experience: not just hardware and not just software, but the end-to-end experience. Clearly Apple has a lot of passion in this space; they have built a strong brand on the user experience of their products. Yet Apple products are themselves not that expensive and are certainly targeting the mass-market space. Apple is clearly going for breadth marketing, and it seems to be working out pretty well for them. Microsoft has done well in this space with Xbox, and recently incredibly well with Kinect: again a low-value, consumer-focused device with mass-market appeal.
So why all this pontificating?
Recently I have come across a few products that are clearly high-value propositions, not targeting consumers but rather specific verticals and small numbers of clients (in the big scheme of things), and yet they are taking a breadth marketing approach. They are twittering like crazy, putting their products out to be touched by consumers who will never be able to afford them, and driving what appears to be a consumer-facing marketing campaign for a non-consumer device. To me this seems like a mistake, and yet maybe they will prove me wrong.
Time will tell.
Thursday, March 17, 2011
Wednesday, March 09, 2011
Step back 30 years (1981) to the days when the mainframe was being positioned as a computing solution for business, accessed via thin client terminals. At the same time a new type of computer was starting to become popular: the personal computer. The personal computer was a low-powered device compared to the mainframe, but it was all yours. There was no sharing of resources, and no concern about your personal and private data getting into the wrong person's hands. Microsoft made it their mission to put a computer on every desktop; it was a noble cause, a democratization of computing power. As personal computing power increased, and with it the software for personal computers, the dream became a reality. The modern personal computer is often more powerful than many of those mainframes.

The personal computer has also become so cheap as to be considered a disposable item by large companies. Place a large number of personal computers together and you can create an immensely powerful shared computing resource. In many ways, this stacking together of large numbers of computers provides more redundancy than a single large, powerful computer. When a part of it fails it really is disposable and can be left as a dead component in the array. The operating system that these personal computers run is not considerably different from the operating system that runs the modern laptop or desktop computer.

This has led to a shift in the dynamics of the industry. The manufacturer of the personal computer operating system (let's face it, I am mainly talking about Microsoft here) is now the same company producing the operating system for the shared computing resource accessed through the modern-day terminal: the browser. This means the same company that led the charge to bring computers to the desk and into your hands is now motivated to create large online shared resource centers, centers that run their operating system.
The interesting thing is that over the last 20 years Microsoft's Windows operating system has matured into a far superior server technology than client technology. Consider the issues clients reported with Vista, and compare that to the lack of issues reported for Windows Server 2008. Windows Server 2008 is a really solid operating system, and yet at its core it is the same operating system as Windows Vista. It should not be surprising, then, that a company like Microsoft would play to its strengths and be motivated to sell more of its server operating systems. Windows 7 has done a lot to regain credibility for Microsoft as an operating system on the personal computer, yet it is clear that market share is being lost to other products.
Many of these other products are no longer the traditional 'personal computer', and Steve Jobs, in his announcement of the iPad 2, referred to these computers as 'post-PC' devices. Clearly your phone, slate computer or TV is not quite the same as your personal computer, yet these devices all harness the same (if not more) computing power as the personal computer of 5 years ago.
These post-PC devices are often designed for richer media consumption, such as movies, video chat, music and games. In order to optimize the experience for the consumer, the software running on these devices is getting increasingly complex.
This clear movement to richer and richer experiences on client devices leads to higher and higher consumption of bandwidth. At the same time, developers are creating greater levels of abstraction to simplify their lives. Consider, for example, the heavy use of XML as a data transfer medium: XML is human readable and easy to debug, yet it increases the bits required to send the information compared to the methods software developers used when the available bandwidth was far more constrained. Couple this increased use of bandwidth with what seems like an insatiable appetite for richer experiences, and we have a scenario where more power is required on both the server and the client.
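The XML overhead is easy to demonstrate. Here is a toy sketch (the sensor reading, its field names and values are all invented for illustration) comparing the size of a small XML payload with a packed binary encoding of the same three values:

```python
import struct
import xml.etree.ElementTree as ET

# The same hypothetical reading encoded two ways.
sensor_id, temp, humidity = 7, 21.5, 48.2

# Human-readable XML: easy to debug, but every tag costs bytes on the wire.
root = ET.Element("reading")
ET.SubElement(root, "sensorId").text = str(sensor_id)
ET.SubElement(root, "temperature").text = str(temp)
ET.SubElement(root, "humidity").text = str(humidity)
xml_bytes = ET.tostring(root)

# Packed binary: one unsigned 16-bit int plus two 32-bit floats = 10 bytes.
packed = struct.pack("<Hff", sensor_id, temp, humidity)

print(len(xml_bytes), len(packed))  # the XML payload is many times larger
```

The binary form is unreadable without the schema in your head, which is exactly the trade-off described above: developer convenience bought with bandwidth.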
The cloud or shared computing resource is here. It has been here for a long time and will be here for a long time yet. Creating a new name for this massive shared computing resource may help with marketing it as the new thing, but ultimately it is nothing more than a huge online computing platform that is shared and scalable. I am sure I remember a salesman from IBM telling me this about the IBM offering in 1986!
The smartphone, tablet, tv and laptop will always provide a better experience when the application is written to target that device specifically. The web provides a lowest common denominator for interfaces that might be interesting for businesses that want to maximize their reach at the cost of user experience. The most enjoyable experiences will always come from applications designed to take advantage of the platform upon which they are running.
I think the argument of cloud vs client is not a real decision that needs to be made. To create an optimal user experience requires a rich client application. To make the experience seamless across the different devices in our pockets and on our desks requires shared storage that is reliable, scalable and can be accessed from (almost) anywhere. It is not about cloud or client, it is about cloud AND client.
Tuesday, February 15, 2011
I recently had a chat with Ken on CodeCast about Surface 2.0, Kinect and a bunch of other Microsoft technologies.
It has been difficult in the last few months to discuss what we have been working on, and there are still areas that we cannot discuss. Yet the cover has been lifted on the Surface 2.0 project and I am proud to have been involved in the new software development that the team at nsquared has been creating.
Grab your headphones and listen to this easy going chat between Ken and myself.
Monday, January 31, 2011
I proposed a total of nine talks for the event; sadly only three were accepted for the community voting. If you would like to attend any of these presentations, please click on the link in the title and vote for them.
Wave, Touch, Pen, Speech, Mouse and Keyboard
In the last decade we have seen a variety of new interfaces popularized. With Microsoft Kinect, you are the controller. There are screens that can see, like Microsoft Surface. We have touch screens that can feel you, and pen interfaces that provide rich digital inking capabilities. Speech technology to control a computer has existed for over a decade now. Yet we are still using the mouse and keyboard almost daily. In this session you will explore how the different input technologies can be applied to different categories of engagement. You will learn why the mouse and keyboard are here to stay and when you should take advantage of the other input technologies. This session will also provide you with some insight into how you can apply combinations of input to enhance your applications further.
How to build a great Microsoft Surface application
Microsoft Surface represents a new category in computing. Vision-based screens enable unique interactions, and they present fantastic opportunities for innovative software. In this session you will learn what makes Microsoft Surface unique and how you can use that to build great software for it. This session will cover the user interfaces and concepts that you need to apply in order to take advantage of the technology in Microsoft Surface. With the imminent release of Microsoft Surface 2.0, this session will cover everything you need to build really amazing experiences for Microsoft Surface. The company Dr. Neil works for has more applications certified for Microsoft Surface than any other company in the world. This session will provide some insight into how they conjure up the magic that enables them to repeatedly build awesome Surface experiences.
Building Really Social Software
Technology can be both an inhibitor and an enabler of social engagement. This session presents a discussion on how technology can be used to enrich the dialogue between users. When you consider many forms of computing today, you think of users staring into a screen; yet the most successful systems, such as Twitter and Facebook, are really about how people converse with each other. In the last few years, new categories of technology, such as Microsoft Surface and Kinect, have emerged that truly bring people together. This session will discuss the way these new technologies (and others) will change the way we can use technology to enhance human interactions.
Then with the other Surface MVPs we are proposing a discussion on all things relating to Natural User Interfaces.
The Microsoft Surface MVPs present: Natural User Interfaces, Today and Tomorrow; an interactive discussion and demonstration
Joshua Blake; Neil Roodyn; Dennis Vroegop; Rick Barraza; Bart Roozendaal; Josh Santangelo; Nicolas Calvi
The Natural User Interface (NUI) is a hot topic that generates a lot of excitement, but there are only a handful of companies doing real innovation with NUIs, and most of the practical experience in the NUI style of design and development is limited to a small number of experts. The Microsoft Surface MVPs are a subset of these experts, with extensive real-world experience with Microsoft Surface and other NUI devices. This session is a panel featuring the Microsoft Surface MVPs and an unfiltered discussion, with each other and the audience, about the state of the art in NUI design and development. We will share our experiences and ideas, discuss what we think NUI will look like in the near future, and back up our statements with cutting-edge demonstrations prepared by the panelists, involving combinations of Microsoft Surface 2.0, Kinect, and Windows Phone 7.
Please submit your vote for the sessions you would like to attend or hear. Remember many of the sessions get recorded and published online after the event.
The presentations that got rejected were:
Confuse me, lose me
Let's face it: no one reads the manual anymore. Software should be so easy to use that the user can just walk up and start using it. So why is this not the case?
In this session Dr. Neil will discuss how interfaces can be built to enable users to get started with new technology and learn by using the system. A rich discussion of the pitfalls of complexity, and of how to simplify your user interface, will allow you to leave this session with a set of tips to make your applications easier to use.
This session will provide an insight into the thinking behind Natural User Interfaces, as well as how to improve your Graphical User Interfaces.
What will the furniture of the future look like? What functionality will it provide? This session provides you with a fun-filled look into the way technology is being built into furniture, and how furniture may look in our meeting rooms, our offices and our homes in 10 years' time.
Microsoft Surface has provided a glimpse into this future of furniture, where a table is no longer just an inert object but provides rich digital content. How else might our future change with the new innovations we expect to see emerge in the coming decade?
Look into my eyes
Face-to-face meetings provide the best way to really understand another person. Human interaction happens between people. Business transactions are agreements between people. It is easy to forget that everything we do in the software industry is still about people. In this session you will discover how technologies like Surface 2.0 can enrich the true social interactions between people. You will also learn why it is important for software developers to understand the nature of human engagement in order to build better software.
Now you’re speaking my language
Do you want to support multiple languages in your software? Can’t afford to pay for a translator to translate your software into every language? Microsoft Translator will help you support multiple languages in your application.
At Mix 09 Microsoft announced the release of the Microsoft Translator SDK. At Mix 10 Microsoft announced an update to the Microsoft Translator SDK, including the Community Translation Framework. Come to this session to find out about the new technology being released by the Microsoft Translator team.
Microsoft Translator is a team within Microsoft Research that focuses on natural language processing. The Microsoft Translator technology is used by Bing Translator, in Internet Explorer and in Office 2010.
So what exactly is technology anyway?
How are you reading this session abstract? Was it created with technology? Is the attendee badge you are wearing technology? One definition of advanced technology is that you will not notice it is there; it will blend into everyday life. With this concept in mind, Dr. Neil will discuss how we can take the technology we are building to the next level, so that it goes unnoticed by our users.
Come to this session to be awakened by the reality of what we are all creating in this world and learn how we can extend the reach of our technology into the everyday lives of our users.
If you would like to hear one of these presentations, please get in touch.
Tuesday, January 18, 2011
It is a gorgeous task management application that synchronizes with Microsoft Exchange or Microsoft SharePoint task lists. I have been using it while it has been developed, and I am so pleased to see it selling like hotcakes now that it is available in the App Store.
Earlier today when I checked, it was the 32nd best-selling business application in the App Store! Congratulations to the nsquared team on another wonderful application.
Tuesday, January 04, 2011
This is something that I have discussed and presented on many times before. In fact, in 2003 I presented proof-of-concept applications on the Microsoft smartphone platform (then it was actually called Microsoft Smartphone) and the Microsoft Tablet PC platform. These applications enabled the user to post their current location to a blog or shared feed. I have used a number of variations of this to keep my 'Where was Dr. Neil' page updated.
In the last couple of years the rise in popularity of Facebook and Twitter has led to an increase in related location-sharing services. Services like foursquare or Facebook's own check-in system enable you to share your location with the world. This is great for those occasions when you want everyone to know where you are. These systems are mostly an all-or-nothing solution: either share with everyone or with no one. So when you just want to let a couple of people know where you are, it is back to the old text message, or even resorting to calling them.
It is for this reason we built Locus. Locus enables you to share your location when you choose and with whomever you choose, including on Facebook and Twitter.
I truly believe the pendulum will start to swing the other way now, with private personal information becoming more valuable. This value will start to be realized by the consumer and leveraged by the consumer to their own advantage.
When will you start to take more care of the information you share and who you share it with?