Previously, I was precalculating the completion that maximized the probability of the word, using some bastardized, half-remembered version of Bayes' law. But I think there is a better approach: running simulations on existing data.
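A minimal sketch of the simulation idea (the corpus and prefix below are made up for illustration): rather than estimating completion probabilities analytically, count how often each completion of a typed prefix actually occurs in existing data and pick the most frequent one.

```python
from collections import Counter

# Toy stand-in for "existing data" -- in practice this would be real usage logs.
corpus = ["the cat sat", "the car moved", "the cat slept", "the cart tipped"]
prefix = "ca"

# Count every word in the data that could complete the typed prefix.
counts = Counter(
    word
    for line in corpus
    for word in line.split()
    if word.startswith(prefix)
)

# The most frequent completion wins.
best, freq = counts.most_common(1)[0]
print(best, freq)  # prints: cat 2
```

The appeal of this approach is that the data does the work: there is no model to mis-specify, only frequencies to count.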
Every week here and in our newsletter, we feature a new developer/blogger from the DZone community. This week we're talking to Pierre-Hugues Charbonneau, Senior IT Consultant and System Architect at CGI Inc., and author of DZone's 200th Refcard: Java Performance Optimization.
//[C# Code Sample]
Dictionary<string, Symbology> collection = new Dictionary<string, Symbology>();
collection.Add("Process Collection", Symbology.DataMatrix);
collection.Add("Dictionary Collection", Symbology.QR);
collection.Add("Aztec BarCode", Symbology.Aztec);

List<Bitmap> images = new List<Bitmap>();
foreach (KeyValuePair<string, Symbology> pair in collection)
{
    using (BarCodeBuilder builder = new BarCodeBuilder())
    {
        builder.CodeText = pair.Key;
        builder.SymbologyType = pair.Value;
        // Render the barcode and keep it for the composite image below
        // (GenerateBarCodeImage is assumed here; the original snippet omitted this call).
        images.Add(builder.GenerateBarCodeImage());
    }
}

// Find the widest barcode and the total height of all of them.
int maxWidth = int.MinValue;
int sumHeight = 0;
foreach (Bitmap bmp in images)
{
    sumHeight += bmp.Height;
    if (maxWidth < bmp.Width)
        maxWidth = bmp.Width;
}

// Stack the barcodes vertically on one bitmap, separated by a margin.
const int offset = 10;
Bitmap resultBitmap = new Bitmap(maxWidth + offset * 2, sumHeight + offset * images.Count);
using (Graphics g = Graphics.FromImage(resultBitmap))
{
    int yPosition = offset;
    for (int i = 0; i < images.Count; ++i)
    {
        Bitmap currentBitmap = images[i];
        g.DrawImage(currentBitmap, offset, yPosition);
        yPosition += currentBitmap.Height + offset;
    }
}
'[VB.NET Code Sample]
Dim collection As New Dictionary(Of String, Symbology)()
collection.Add("Process Collection", Symbology.DataMatrix)
collection.Add("Dictionary Collection", Symbology.QR)
collection.Add("Aztec BarCode", Symbology.Aztec)

Dim images As New List(Of Bitmap)()
For Each pair As KeyValuePair(Of String, Symbology) In collection
    Using builder As New BarCodeBuilder()
        builder.CodeText = pair.Key
        builder.SymbologyType = pair.Value
        ' Render the barcode and keep it for the composite image below
        ' (GenerateBarCodeImage is assumed here; the original snippet omitted this call).
        images.Add(builder.GenerateBarCodeImage())
    End Using
Next

' Find the widest barcode and the total height of all of them.
Dim maxWidth As Integer = Integer.MinValue
Dim sumHeight As Integer = 0
For Each bmp As Bitmap In images
    sumHeight += bmp.Height
    If maxWidth < bmp.Width Then
        maxWidth = bmp.Width
    End If
Next

' Stack the barcodes vertically on one bitmap, separated by a margin.
Const offset As Integer = 10
Dim resultBitmap As New Bitmap(maxWidth + offset * 2, sumHeight + offset * images.Count)
Using g As Graphics = Graphics.FromImage(resultBitmap)
    Dim yPosition As Integer = offset
    For i As Integer = 0 To images.Count - 1
        Dim currentBitmap As Bitmap = images(i)
        g.DrawImage(currentBitmap, offset, yPosition)
        yPosition += currentBitmap.Height + offset
    Next
End Using
So you can see where I'm heading with the question posed in the title of this post. It is the reference resolution that is fundamental; an indirection is simply what we do to (re)define what that resolution process ultimately looks like.
Since Scrum is an iterative process, you can have great success by focusing on things that you can say with a fair degree of confidence, while still allowing for some uncertainty in all planning and estimates.
We got a customer question about a map/reduce index that produced the wrong results. The problem was a mismatch between the customer's conceptual model and how Map/Reduce actually works.
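The usual Map/Reduce contract, sketched below with made-up data, is that the reduce step may run more than once over partial results. A reduce function therefore has to give the same answer whether it sees all the values at once or re-reduces the outputs of earlier batches; indexes that break this property produce exactly this kind of wrong result.

```python
# Illustrative data: (key, value) pairs emitted by a hypothetical map step.
data = [("a", 1), ("b", 2), ("a", 3), ("a", 4)]

def reduce_step(values):
    # Summing is safe to re-apply: reducing reduced results gives the same total.
    return sum(values)

# Reduce all values for key "a" in one pass...
full = reduce_step([v for k, v in data if k == "a"])

# ...or reduce partial batches, then reduce the partial results again.
batch1 = reduce_step([1, 3])
batch2 = reduce_step([4])
rereduced = reduce_step([batch1, batch2])

print(full, rereduced)  # prints: 8 8
```

A reduce that, say, counted the number of input rows instead of summing previously counted totals would pass the single-pass case and silently fail the re-reduce case.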
Almost every New Product Introduction process includes some placeholder slots to talk about risk. Everyone knows enough to put mitigation plans in these slots. But those mitigation plans are drafted and then ignored by the vast majority of managers.
When you make GitHub commits, you have to provide a message that tells the story of the changes you are committing to a repository. Many of us just post 'blah blah', 'what I said last time', or any other garbage that just gets us through the moment. You know you've all done it at some point.
-- Relative time with interval
DATE_SUB(NOW(), INTERVAL 25 HOUR)
-- Select all users that were updated in the last 24 hours.
SELECT * FROM users
WHERE users.updated > DATE_SUB(NOW(), INTERVAL 24 HOUR);
-- Select all users that were updated in the last 7 days.
SELECT * FROM users
WHERE users.updated > DATE_SUB(NOW(), INTERVAL 7 DAY);
Econometrics is often “theory driven” while statistics tends to be “data driven”. I discovered this in the interview for my current job when someone criticized my research for being “data driven” and asked me to respond.
Whenever a new networking platform is evaluated, one of the early sales calls includes a packet walkthrough. But why?
So, where did our new event fall on a scale compared to the others? Did I feel like attendees received value and did I feel like we got a good return on our investment? Let me give a little comparison.
Today, DZone released Refcard #200: Java Performance Optimization. To mark the significance of this milestone, this Refcard boasts a complete redesign - all the information you expect from a Refcard in a shiny new package!
It's dangerous to link to lines or blocks of code on GitHub without explicitly specifying the commit hash in the URL: such links point at the current branch, so later commits shift the line numbers and break the link. Instead, emit the GitHub URL for the HEAD commit on the current branch, specifying the commit hash in the URL.
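A small sketch of the idea (the `user/repo` path and file name are placeholders; the demo builds a throwaway repository so it can run anywhere, but in practice you would run `git rev-parse HEAD` inside your own project):

```python
import os
import subprocess
import tempfile

# Build a throwaway repo purely for demonstration.
repo = tempfile.mkdtemp()

def run(*args):
    subprocess.run(args, cwd=repo, check=True, capture_output=True)

run("git", "init")
run("git", "config", "user.email", "demo@example.com")
run("git", "config", "user.name", "Demo")
with open(os.path.join(repo, "app.rb"), "w") as f:
    f.write("puts 'hi'\n")
run("git", "add", "app.rb")
run("git", "commit", "-m", "demo")

# Resolve the HEAD commit hash and pin the line link to it.
sha = subprocess.check_output(
    ["git", "rev-parse", "HEAD"], cwd=repo, text=True
).strip()
url = f"https://github.com/user/repo/blob/{sha}/app.rb#L1"
print(url)
```

Because the URL names an immutable commit rather than a branch, the highlighted lines stay correct no matter what lands on the branch afterwards.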
The administrators on the server noted the high I/O and that a single thread was constantly busy, and decided that this was likely a hung thread. The long-term fix was to make sure that we abort the operation after a while, report to the remote server that we scanned up to a point and had nothing to show for it, and go back to the replication loop.
I thought it’d be interesting to create some visualisations around the times that people RSVP ‘yes’ to the various Neo4j events that we run in London. I tried to use ggplot to create a bar chart of the data. Unfortunately that resulted in this error:
I have created a Minimum Reading List for an Agile Transition. Note the emphasis on minimum. I could have added many more books to this list. But the problem I see is that people don’t read anything. They think they do agile if they say they do agile.
"Release!" is a card game about making software inspired by development strategies like Lean, Agile, and DevOps.
Proponents of the Waterfall Model suggest that doing all the design up front, and making sure that each part of the process is correct before moving on to the next part, means that bugs are found sooner and therefore costs are reduced.
According to Dr. Gary McGraw's groundbreaking work on software security, up to half of security mistakes are made in design rather than in coding. For the last 10 years we've been told that we are supposed to address this through threat modeling. What else can we do to include security in application design?