From: Joona Kiiski
Date: Thu, 9 Apr 2009 04:29:13 +0000 (+0300)
Subject: Razor again at depth one
X-Git-Url: https://git.sesse.net/?p=stockfish;a=commitdiff_plain;h=d20e0cf048c7e901f422f61259cd2fbfdba0b209;hp=342ceb1c9124ced53650d6c422a8cfa32e0b12ef

Razor again at depth one

Some time ago Marco Costalba found that it is better to disable razoring
at depth one: given the very low evaluation of the node, futility pruning
already does the job there at very low cost and without missing important
moves.

Now enable razoring at depth one again, but only when the quickly
approximated evaluation is already more than a rook below beta. The idea
is to try razoring only when it is extremely likely to succeed.

An extreme lightning-speed test shows a promising result:

  Orig - Mod: +1285 =1495 -1348

This still needs to be tested at longer time controls, though.

Signed-off-by: Marco Costalba
---

diff --git a/src/search.cpp b/src/search.cpp
index a3912603..5ccc72c8 100644
--- a/src/search.cpp
+++ b/src/search.cpp
@@ -193,8 +193,8 @@ namespace {
   //remaining depth:    1 ply         1.5 ply       2 ply         2.5 ply       3 ply         3.5 ply
   const Value RazorMargins[6]     = { Value(0x180), Value(0x300), Value(0x300), Value(0x3C0), Value(0x3C0), Value(0x3C0) };
 
-  //remaining depth:    1 ply            1.5 ply       2 ply         2.5 ply       3 ply         3.5 ply
-  const Value RazorApprMargins[6] = { Value(0x100000), Value(0x300), Value(0x300), Value(0x300), Value(0x300), Value(0x300) };
+  //remaining depth:    1 ply         1.5 ply       2 ply         2.5 ply       3 ply         3.5 ply
+  const Value RazorApprMargins[6] = { Value(0x520), Value(0x300), Value(0x300), Value(0x300), Value(0x300), Value(0x300) };
 
   // Last seconds noise filtering (LSN)
   bool UseLSNFiltering;
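
For context, below is a minimal stand-alone C++ sketch of how two margin
tables like these are typically combined to gate razoring. It is not the
actual search.cpp code; the names try_razoring, approxEval, fullEval and
marginIndex are invented here, and plain int stands in for the engine's
Value type. It only illustrates why the old depth-one value of 0x100000
effectively disabled razoring at 1 ply, while 0x520 re-enables it only
when the quick evaluation is already far below beta.

  #include <iostream>

  typedef int Value;   // stand-in for the engine's Value type

  // Margins as they appear after this patch (index 0 = 1 ply remaining).
  const Value RazorMargins[6]     = { 0x180, 0x300, 0x300, 0x3C0, 0x3C0, 0x3C0 };
  const Value RazorApprMargins[6] = { 0x520, 0x300, 0x300, 0x300, 0x300, 0x300 };

  // Razoring is tried only if the cheap approximate evaluation is already
  // far below beta (the "appr" margin) and the full evaluation is below
  // beta by the normal razor margin as well. With the old depth-one
  // approximation margin of 0x100000 the first test could not
  // realistically pass, which is what disabled razoring at depth one.
  bool try_razoring(int marginIndex, Value approxEval, Value fullEval, Value beta)
  {
      return approxEval < beta - RazorApprMargins[marginIndex]
          && fullEval   < beta - RazorMargins[marginIndex];
  }

  int main()
  {
      const Value beta = 0;
      // Quick eval more than a rook below beta: razoring is tried at 1 ply.
      std::cout << try_razoring(0, beta - 0x600, beta - 0x400, beta) << '\n';  // prints 1
      // Only slightly below beta: razoring is skipped at 1 ply.
      std::cout << try_razoring(0, beta - 0x200, beta - 0x200, beta) << '\n';  // prints 0
      return 0;
  }

Compiled and run, the example prints 1 for the clearly losing node and 0
for the near-beta node, matching the intent of the commit message.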