This post is not about where the ideas come from, or even about how to convert what already exists into user stories. The focus here is on how to conduct that initial refinement meeting, and then what might be different in subsequent refinement sessions.
These items are a combination of tech business news, development news, and programming tools and techniques.
Here, in no particular order, are the top 9 reasons I like working with Go: the toolchain, clean code, goroutines, channels, metaphysical parsimony, and 4 more...
Agile Practitioners 2015 is getting underway, and the first step is the Call for Papers! The Agile Practitioners conference started four years ago and is a genuine community effort. I'm proud to be part of the organizing committee, after presenting at the last three gatherings.
If you’ve been paying attention to agile at all, you’ve heard these terms: pairing and swarming. But what do they mean? What’s the difference?
If you missed anything on DZone this week, now's your chance to catch up! This week's best include a Spring MVC 3 view controller example, a look at the new mobile database, Realm, the Java origins of AngularJS, 5 quick points about threads in Java EE, and more.
Interruptions seem a central part of modern working life. Whether it's open-plan offices or the influx of digital tools into the workplace, a distraction never seems far away.
When modelling data with ARIMA models, it is sometimes useful to plot the inverse characteristic roots. The following functions will compute and plot the inverse roots for any fitted ARIMA model (including seasonal models).
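The functions referenced above are written in R; as a rough illustration of the same idea, here is a minimal Python sketch (the function name `inverse_ar_roots` is my own, not from the original post). For an AR polynomial 1 − φ₁z − … − φₚzᵖ, the model is stationary exactly when every inverse root lies inside the unit circle, which is why plotting them is a handy diagnostic.

```python
import numpy as np

def inverse_ar_roots(phi):
    """Inverse characteristic roots of an AR polynomial
    1 - phi_1 z - ... - phi_p z^p.  The process is stationary
    when all inverse roots lie inside the unit circle."""
    # Coefficients in increasing powers of z: [1, -phi_1, ..., -phi_p]
    poly = np.r_[1.0, -np.asarray(phi, dtype=float)]
    roots = np.polynomial.polynomial.polyroots(poly)
    return 1.0 / roots

# AR(2) example: y_t = 0.5 y_{t-1} + 0.3 y_{t-2} + e_t
inv = inverse_ar_roots([0.5, 0.3])
print(np.all(np.abs(inv) < 1))  # all inside the unit circle -> stationary
```

The same construction extends to the MA polynomial (with a `+` sign on the coefficients) and, for seasonal models, to the seasonal polynomials evaluated at powers of z.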
Certainly, Big Data applications are distributed largely because the size of the data on which computations are executed warrants more than a typical application can handle. But scaling the network that provides connectivity between Big Data nodes is not just about creating massive interconnects.
Previously, I was precalculating the completion that maximized the probability of the word, using some bastardized, half-remembered version of Bayes' law. But I think there is a better approach: running simulations on existing data.
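The original "maximize the probability of the word" baseline can be sketched as a simple frequency lookup over a corpus; this is my own minimal illustration (the function `best_completion` and the toy corpus are hypothetical, not from the post), against which a simulation-based approach would be compared.

```python
from collections import Counter

def best_completion(prefix, corpus):
    """Return the completion that maximizes P(word | prefix),
    estimated from raw word frequencies in an existing corpus."""
    counts = Counter(w for w in corpus if w.startswith(prefix))
    if not counts:
        return None
    return counts.most_common(1)[0][0]

corpus = ["the", "then", "the", "theory", "the", "then"]
print(best_completion("the", corpus))  # "the" -- the most frequent match
```

A simulation-based alternative would instead replay historical keystroke data and score each candidate by how often offering it would actually have saved the user typing.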
Every week here and in our newsletter, we feature a new developer/blogger from the DZone community. This week we're talking to Pierre-Hugues Charbonneau, Senior IT Consultant and System Architect at CGI Inc., and author of DZone's 200th Refcard: Java Performance Optimization.
// [C# Code Sample]
Dictionary<string, Symbology> collection = new Dictionary<string, Symbology>();
collection.Add("Process Collection", Symbology.DataMatrix);
collection.Add("Dictionary Collection", Symbology.QR);
collection.Add("Aztec BarCode", Symbology.Aztec);

// Generate one barcode image per entry.
List<Bitmap> images = new List<Bitmap>();
foreach (KeyValuePair<string, Symbology> pair in collection)
{
    using (BarCodeBuilder builder = new BarCodeBuilder())
    {
        builder.CodeText = pair.Key;
        builder.SymbologyType = pair.Value;
        images.Add(builder.GenerateBarCodeImage()); // collect the generated image
    }
}

// Compute the size of the combined canvas.
int maxWidth = int.MinValue;
int sumHeight = 0;
foreach (Bitmap bmp in images)
{
    sumHeight += bmp.Height;
    if (maxWidth < bmp.Width)
        maxWidth = bmp.Width;
}

// Stack all barcode images vertically, separated by a 10-pixel offset.
const int offset = 10;
Bitmap resultBitmap = new Bitmap(maxWidth + offset * 2, sumHeight + offset * images.Count);
using (Graphics g = Graphics.FromImage(resultBitmap))
{
    int yPosition = offset;
    for (int i = 0; i < images.Count; ++i)
    {
        Bitmap currentBitmap = images[i];
        g.DrawImage(currentBitmap, offset, yPosition);
        yPosition += currentBitmap.Height + offset;
    }
}
' [VB.NET Code Sample]
Dim collection As New Dictionary(Of String, Symbology)()
collection.Add("Process Collection", Symbology.DataMatrix)
collection.Add("Dictionary Collection", Symbology.QR)
collection.Add("Aztec BarCode", Symbology.Aztec)

' Generate one barcode image per entry.
Dim images As New List(Of Bitmap)()
For Each pair As KeyValuePair(Of String, Symbology) In collection
    Using builder As New BarCodeBuilder()
        builder.CodeText = pair.Key
        builder.SymbologyType = pair.Value
        images.Add(builder.GenerateBarCodeImage()) ' collect the generated image
    End Using
Next

' Compute the size of the combined canvas.
Dim maxWidth As Integer = Integer.MinValue
Dim sumHeight As Integer = 0
For Each bmp As Bitmap In images
    sumHeight += bmp.Height
    If maxWidth < bmp.Width Then
        maxWidth = bmp.Width
    End If
Next

' Stack all barcode images vertically, separated by a 10-pixel offset.
Const offset As Integer = 10
Dim resultBitmap As New Bitmap(maxWidth + offset * 2, sumHeight + offset * images.Count)
Using g As Graphics = Graphics.FromImage(resultBitmap)
    Dim yPosition As Integer = offset
    For i As Integer = 0 To images.Count - 1
        Dim currentBitmap As Bitmap = images(i)
        g.DrawImage(currentBitmap, offset, yPosition)
        yPosition += currentBitmap.Height + offset
    Next
End Using
So you can see where I'm heading with the question posed in the title of this post. It is the reference resolution that is fundamental; an indirection is simply what we do to (re)define what that resolution process ultimately looks like.
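The point can be made concrete with a small sketch (my own illustration, not from the post): a name never holds a value directly, it resolves to one, and "adding an indirection" is just swapping in a different resolution target for the same name.

```python
# A name resolves to a value at lookup time; indirection is
# redefining what that resolution step points at.
registry = {"handler": lambda x: x + 1}

def call(name, arg):
    # Resolution happens here, at call time, not at definition time.
    return registry[name](arg)

print(call("handler", 1))               # 2
registry["handler"] = lambda x: x * 10  # redefine the resolution target
print(call("handler", 1))               # 10 -- same name, new resolution
</imports>```

The caller's code never changed; only the resolution of the name did, which is the sense in which resolution, not the extra layer itself, is the fundamental operation.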
Since Scrum is an iterative process, you can have great success by focusing on things that you can say with a fair degree of confidence, while still allowing for some uncertainty in all planning and estimates.
We got a customer question about a map/reduce index that produced the wrong results. The problem was a mismatch between the customer's conceptual model and how Map/Reduce actually works.
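A common version of this mismatch is forgetting that the engine may feed reduce's own output back into reduce (re-reduction of partial results), so the reduce function must accept the shape it emits. Here is a minimal Python sketch of that contract (the function names and document shapes are my own illustration, not the customer's index):

```python
from collections import defaultdict

def map_docs(docs):
    # Map emits {key, count} items, one per document.
    for d in docs:
        yield {"key": d["category"], "count": 1}

def reduce_items(items):
    # Reduce must accept its OWN output: the engine may re-reduce
    # partial results, so items can be raw (count=1) or aggregated.
    totals = defaultdict(int)
    for it in items:
        totals[it["key"]] += it["count"]
    return [{"key": k, "count": v} for k, v in totals.items()]

docs = [{"category": "a"}, {"category": "b"}, {"category": "a"}]
partial = reduce_items(map_docs(docs[:2]))                 # first batch
final = reduce_items(partial + list(map_docs(docs[2:])))   # re-reduce
```

A reduce that assumed it would only ever see raw map output (for example, one that counted items instead of summing their counts) would produce correct results on small data and silently wrong results once re-reduction kicks in.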
When you make GitHub commits you have to provide a message that explains the changes you are committing to a repository. Many of us just write 'blah blah', 'what I said last time', or any other garbage that gets us through the moment. You know you've all done it at some point.
Almost every New Product Introduction process includes some placeholder slots to talk about risk. Everyone knows enough to put mitigation plans in these slots. But those mitigation plans are drafted and then ignored by the vast majority of managers.
-- Computing a relative time with an interval (e.g. 25 hours ago):
DATE_SUB(NOW(), INTERVAL 25 HOUR)
-- Select all users that were updated in the last 24 hours.
SELECT * FROM users
WHERE users.updated > DATE_SUB(NOW(), INTERVAL 24 HOUR);
-- Select all users that were updated in the last 7 days.
SELECT * FROM users
WHERE users.updated > DATE_SUB(NOW(), INTERVAL 7 DAY);
Whenever a new networking platform is evaluated, one of the early sales calls includes a packet walkthrough. But why?
Econometrics is often “theory driven” while statistics tends to be “data driven”. I discovered this in the interview for my current job when someone criticized my research for being “data driven” and asked me to respond.