A quick way to scan a bunch of WordPress installs for the Docs virus. Not sure of the real name since I can't find a reference to it on the net.
dir /s /b *.php | find "docs.php"
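That command is for Windows; on a Linux host, a rough equivalent (my own translation, assuming you run it from the WordPress install root) is:

```shell
# List every file named docs.php under the current directory.
# Run from the WordPress install root (an assumption about your layout).
find . -type f -name 'docs.php'
```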
A more thorough search is:
findstr /sim /c:"chr(base_convert(substr" *.php
This looks for the actual code that decodes the virus.
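For Linux hosts, a grep sketch of the same search (a hedged equivalent of the findstr flags, not part of the original write-up): -r recurses, -i ignores case, -l prints only file names, and -F treats the pattern as a fixed string so the parentheses are matched literally.

```shell
# Find PHP files containing the decoder string, names only.
grep -rilF --include='*.php' 'chr(base_convert(substr' .
```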
Either method could produce some false positives, so apply some common sense before deleting files.
Removing the virus consists of checking the first line in the functions.php file for each theme and removing the junk after <?php
Then delete the Docs plugin.
You will need to do this with your site disabled as the plugin will reinfect the functions.php files and the functions.php extras will reinstall the plugin.
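Before editing anything, it helps to see which themes are affected. Something like this (a sketch assuming the standard wp-content/themes layout) prints the first line of each theme's functions.php so you can spot the injected junk:

```shell
# Show the first line of every theme's functions.php for manual inspection.
for f in wp-content/themes/*/functions.php; do
  echo "== $f"
  head -n 1 "$f"
done
```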
I haven't spent too much time trying to figure out what it does exactly. However, this is what I do know.
The code added to the theme functions.php does the following:
First it checks to see if the wp-content/plugins/docs/docs.php file exists and, if not, downloads a copy from http://lamulata.biz/cript_doc.php
Next if the docs.php file exists it checks to see if it is activated. If not it activates it.
Registers a daily cron job which is used to update itself.
Uses the docs_wp_plugin_active_list_update action to hide itself from the plugin list.
Hooks the wp_head action which seems to send some data about the current request to http://22.214.171.124/index.php
Ultimately it seems to fetch some content from a remote site and includes it in the page content or completely replaces the content.
One of the side effects is that it looks for a writeable tmp directory and caches data there. I found over 25,000 files in one cache folder.
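If you want to gauge the size of such a cache, a quick file count works (the path below is just a placeholder; point it at whatever writable directory the malware actually used on your host):

```shell
# Count the files sitting directly inside the suspected cache directory.
# CACHE_DIR is an assumption -- substitute the real path you find.
CACHE_DIR=/tmp
find "$CACHE_DIR" -maxdepth 1 -type f | wc -l
```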
I recently had to remove a StealRat infection from a computer. Unfortunately most of the available information is out of date and only helped somewhat.
Once I found the actual file that was the issue I developed a better command to detect any other infections:
findstr /sim /c:"'](NULL)" *.php
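On a Linux host, a grep equivalent of that findstr search (my own translation, not from any StealRat write-up) would be:

```shell
# Fixed-string search so the bracket and parentheses are literal.
grep -rilF --include='*.php' "'](NULL)" .
```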
While researching a topic for my book I came across the original paper that helped spawn the concept of Yesterday's Weather: the idea that the amount of work you will do next week is highly likely to be the same as the amount of work you did last week.
In this study, the authors used 111 time series to examine the accuracy of various forecasting methods, particularly time-series methods. The study shows, at least for time series, why some methods achieve greater accuracy than others for different types of data. The authors offer some explanation of the seemingly conflicting conclusions of past empirical research on the accuracy of forecasting. One novel contribution of the paper is the development of regression equations expressing accuracy as a function of factors such as randomness, seasonality, trend-cycle and the number of data points describing the series. Surprisingly, the study shows that for these 111 series simpler methods perform well in comparison to the more complex and statistically sophisticated ARMA models.
Accuracy of Forecasting: An Empirical Investigation, Spyros Makridakis, Michele Hibon and Claus Moser, Journal of the Royal Statistical Society. Series A (General), Vol. 142, No. 2 (1979), pp. 97-145.
Here is the bike with the tire removed. No big surprises here except the shop manual doesn't say what size the axle nut is (30mm) and I had to go buy one.
Tom DeMarco, arguably one of the key thinkers when it comes to how we develop software, has been reflecting.
My early metrics book, Controlling Software Projects: Management, Measurement, and Estimation (Prentice Hall/Yourdon Press, 1982), played a role in the way many budding software engineers quantified work and planned their projects. In my reflective mood, I’m wondering, was its advice correct at the time, is it still relevant, and do I still believe that metrics are a must for any successful software development effort? My answers are no, no, and no.
If that doesn't rock you back on your heels, then you need to re-read that paragraph.
Next you need to go read the whole article (2 pages).
As someone who prefers the agile approach, I have been pushing the value-based approach over the control-based one for nearly a decade now. But to see someone like Tom publicly question what he (and we) have been doing for the last 30 years makes me respect Tom even more, and gives me hope that as an industry we are heading in the right direction.
Jack was of the opinion that:
the more one spends time tracking metrics, the less time there is for development
While I have some sympathy for this point of view having worked for larger organizations in the past, I have come to realize that you do need some type of metric that is understandable to the rest of the organization. All the other departments in your organization have an overriding single number that describes how they are doing, why not software development?
As I mentioned in my No More Iterations post, throughput is my metric of choice. The cost of collecting this metric is so low that it doesn't matter.
Now I have been asked to provide all sorts of low level metrics in the past not knowing how they were going to be used. I was not inclined to cooperate in those cases since the time required to collect them was never going to be offset by any value coming back to my teams. And this is most likely what Jack is protesting.
I like being proactive and providing a metric I think is useful, rather than waiting for someone who doesn't really understand software development to ask me to have my teams track actual effort against estimated effort in units of 0.1 hours (really, I have been asked to provide this!).