Israel, in prosecuting its war in Gaza, has relied on an AI system named "Lavender".
Lavender uses data on known Hamas affiliates to identify targets for strikes. At its peak, the system flagged some 37,000 men, a list so broad that it swept in police officers and civil-defense workers. The criteria were eventually narrowed to cover only Hamas fighters, commanders, and the like.
Imagine you're a low-ranking Hamas fighter. Lavender tells the Israeli air force where your home is, and they decide to drop a bomb on it. You have family members living with you; the system estimates that killing you would kill no more than 10-15 other people, so the strike is approved. If you were high-ranking, of course, a much higher death toll could be justified. Ironically, strikes on low-ranking targets can end up being less discriminate: cheap but imprecise "dumb bombs" are reserved for low-priority targets, while expensive "smart bombs" are saved for bigger fish, and imprecise bombs carry a higher risk of collateral damage.
Hamas-controlled authorities in Gaza estimate that more than 33,000 Gazans have been killed thus far. Hypothetically, that could mean 30,000 civilians killed in the course of neutralizing 3,000 rank-and-file fighters.
It's true, then, that Israel is killing the families of terrorists as a matter of official policy.