Mourners bury one of the last hostages released from Gaza as talks start for ceasefire future


By JULIA FRANKEL

JERUSALEM (AP) — Mourners in Israel on Friday were burying the remains of one of the final hostages released in the first phase of the ceasefire between Hamas and Israel, as negotiators discussed a second phase that could end the war in Gaza and see the remaining living captives returned home.


The funeral procession for Tsachi Idan, an avid soccer fan who was 49 when he was abducted by Hamas, began at a Tel Aviv football stadium en route to the cemetery where he was to be buried in a private ceremony.

The office of Israeli Prime Minister Benjamin Netanyahu has said Idan, taken from Kibbutz Nahal Oz during the Hamas-led Oct. 7, 2023, attack that sparked the war in Gaza, was killed in captivity.

His body was one of four released by Hamas early Thursday in exchange for over 600 Palestinian prisoners, the last planned swap of the ceasefire’s first phase, which began in January. Hamas has been designated a terrorist organization by the United States, Canada, and European Union.

Idan was the only member of his family taken to Gaza. His eldest daughter, Maayan, was killed as terrorists shot through the door of their safe room. Hamas fighters broadcast themselves on Facebook Live holding the Idan family hostage in their home as his two younger children pleaded to be let go.

“My brother is the real hero. He held on,” Idan’s sister, Noam Idan ben Ezra, said in an interview on Israeli radio Friday. She said Idan had been “a pace away” from being released during a brief ceasefire in November 2023, when more than 100 of the 251 people abducted on Oct. 7 were released.

People attend a public memorial ceremony for slain hostage Tsachi Idan, a fan of Hapoel Tel Aviv F.C., who was killed in Hamas captivity in the Gaza Strip, at Bloomfield Stadium in Tel Aviv, Israel, Friday, Feb. 28, 2025. Hebrew: “Tsachi Idan – red in the soul.” (AP Photo/Leo Correa)

“Tsachi was forsaken twice. The first time when he was kidnapped from his home and the second time when the deal blew up,” she added. “The fact that Tsachi is not standing next to me today is the outcome of the decision-making and the policy here in Israel. They did not listen to us then, but it’s not too late to listen to us today.”

Concern for remaining hostages

With the first phase of the ceasefire deal set to end Saturday, relatives of hostages still held in Gaza are ramping up pressure on Netanyahu to secure the release of their loved ones.

According to Israel, 32 of the 59 hostages still in Gaza are dead, and there has been growing concern about the welfare of an unknown number who are still alive, particularly after three hostages released Feb. 8 appeared emaciated.

One of the three, Eli Sharabi, said in an interview with Israel’s Channel 12 Friday that he and other hostages had been held in iron chains, starved and sometimes beaten or humiliated.

“During the first three days, my hands are tied behind my back, my legs are tied, with ropes that tear into your flesh, and a bit of food, a bit of water during the day,” he said, in one of the first interviews by a hostage released under the current deal. “I remember not being able to fall asleep because of the pain, the ropes are already digging into your flesh, and every movement makes you want to scream.”

Sharabi found out after his release that his wife and daughters had been killed during the Oct. 7 attack.

The next phase of the ceasefire

Under the terms of the truce Israel and Hamas agreed to, Phase 2 of the ceasefire is to involve negotiations on ending the war that has devastated the Gaza Strip. That includes the return of all remaining living hostages and the withdrawal of all Israeli troops from the Palestinian territory. The return of the bodies of the remaining deceased hostages would occur in Phase 3.

Hamas said in a statement released Friday that it “reaffirms its full commitment to implementing all terms of the agreement in all its stages and details.” It called on the international community to pressure Israel to “immediately proceed to the second phase without any delay or evasion.”

Officials from Israel, Qatar and the United States have started “intensive discussions” on the ceasefire’s second phase in Cairo, Egypt’s state information service said Thursday. Netanyahu’s office confirmed he had sent a delegation to Cairo. Israel has reportedly been seeking an extension of the first phase to secure the release of additional hostages.

“The mediators are also discussing ways to enhance the delivery of humanitarian aid to the Gaza Strip, as part of efforts to alleviate the suffering of the population and support stability in the region,” said the statement from the prime minister’s office.

Israel’s negotiators will return home Friday night, said an Israeli official, speaking to The Associated Press on condition of anonymity to discuss the closed-door talks. Negotiations are set to continue Saturday, the official said. But it was not clear if the Israeli team would travel back to Cairo to attend them.

United Nations Secretary-General Antonio Guterres said Friday that the coming days are “critical,” and urged Israel and Hamas to fulfill their commitments.

The first phase of the ceasefire saw 33 hostages, including eight bodies, released in exchange for nearly 2,000 Palestinian prisoners. Netanyahu has vowed to return all the hostages and destroy the military and governing capabilities of Hamas, which remains in control of Gaza. The Trump administration has endorsed both goals.

But it’s unclear how Israel would destroy Hamas without resuming the war, and Hamas is unlikely to release the remaining hostages — its main bargaining chips — without a lasting ceasefire. After suffering heavy losses in the war, the group has nonetheless emerged intact, and says it will not give up its weapons.

The ceasefire, brokered by the United States, Egypt and Qatar, ended 15 months of war that erupted after Hamas’ 2023 attack on southern Israel that killed about 1,200 people.

Israel’s military offensive has killed more than 48,000 Palestinians, according to Palestinian health officials, who don’t differentiate between civilian and combatant deaths but say over half the dead have been women and children.

The fighting displaced an estimated 90% of Gaza’s population and decimated the territory’s infrastructure and health system.

Palestinians prepare for Ramadan amid destroyed homes

Palestinians who returned to destroyed homes in Gaza City began preparing for Ramadan on Friday, shopping for essential household goods and food. Some say the Islamic holy month feels better than the one they spent last year, but still far from normal.

“The situation is very difficult for people and life is very hard. Most people — their homes have been destroyed. Some people can’t afford to shop for Ramadan, but our faith in God is great as he never forgets to bless people,” said Gaza City resident Nasser Shoueikh.

Ramadan is a holy Islamic month during which observant Muslims around the world practice the ritual of daily fasting from dawn to sunset. It’s often known for increased prayers, charity and spirituality as well as family gatherings enjoying different dishes and desserts during Iftar, when Muslims break their fasting, and Suhoor, the last meal before sunrise.

Associated Press writer Tia Goldenberg in Tel Aviv contributed.

Follow AP’s war coverage at https://apnews.com/hub/israel-hamas-war

New stove that plugs into a normal wall outlet could be major gain for health and the climate


By ISABELLA O’MALLEY

NEW YORK (AP) — For years, Ed Yaker, treasurer of a New York City co-op with nearly 1,500 units, and fellow board members have dealt with gas leaks. It can mean the gas at an entire building is shut off, leaving residents unable to use a stove for months until expensive repairs are made to gas lines.

So Yaker was all in when he learned of a California startup called Copper that was manufacturing an electric stove and oven that could simply be plugged into a regular outlet. The sleek, standard four-burner electric induction stove runs on 120 volts, meaning there is no need to pay a licensed electrician thousands of dollars to rewire to 240 volts, which many electric stoves require.

“In terms of, ‘Is this the way to go?’ It’s a no-brainer,” Yaker said as he demonstrated by boiling a quart of water in about two minutes. His apartment is full of books, many on energy and climate change, and energy efficiency was a motivation, too.

Then there are the health benefits of cooking with electricity. Gas stoves, which 47 million Americans use, release pollutants such as nitrogen dioxide, which has been linked to asthma, as well as cancer-causing benzene.

“You wouldn’t stand over the tailpipe of a car breathing in the exhaust from that car. And yet nearly 50 million households stand over a gas stove, breathing the same pollutants in their homes,” said Rob Jackson, an environmental scientist at Stanford University and lead author on a study on pollution from gas cooking.

“I had a gas stove until I started this line of research. Watching pollutant levels rise almost immediately every time I turned a burner on, or my oven on, was enough to get me to switch” to an electric stove, he said.

Induction stoves are also a way to address the considerable amount of climate change that comes from buildings — emissions from cooking, heating and cooling living spaces and hot water.


In the case of gas stoves, about half of the flame’s heat escapes into the room. Electric stoves, by comparison, can be up to 80% efficient. Induction stoves come out on top with up to 90% efficiency, in part because they heat only where the surface contacts the pot.
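
The efficiency figures above translate directly into energy drawn per cooking task. Here is a rough back-of-the-envelope sketch; the 50%, 80% and 90% efficiencies are the approximate values cited in this article, not measured data:

```python
# Energy needed to heat one quart (~0.95 kg) of water from 20 °C to 100 °C,
# then the input energy each stove type must draw at its assumed efficiency.
SPECIFIC_HEAT_WATER = 4186  # J per kg per °C
mass_kg = 0.95   # about one quart of water
delta_t = 80     # °C, from 20 °C to boiling

useful_energy_j = mass_kg * SPECIFIC_HEAT_WATER * delta_t  # Q = m * c * dT

def input_energy(useful_j, efficiency):
    """Energy the stove must draw for a given burner efficiency (0-1)."""
    return useful_j / efficiency

for name, eff in [("gas, ~50%", 0.50), ("electric coil, ~80%", 0.80), ("induction, ~90%", 0.90)]:
    print(f"{name}: {input_energy(useful_energy_j, eff) / 1000:.0f} kJ drawn")
```

By this rough math, a gas burner draws nearly twice the energy of an induction burner to boil the same pot of water.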

Just the presence of a gas stove in a home contributes to greenhouse gas emissions, even when it’s not turned on. Jackson’s team found gas stoves bleed methane — the main constituent of natural gas — when they’re off, from loose fittings and at connections between the stove and wall. The climate impact of leaky stoves in U.S. homes was estimated to be comparable to carbon emissions from 500,000 gasoline-powered cars.

The stove contains a smart battery that can charge when electricity rates are low, allowing people to cook without incurring peak-rate charges.
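
The off-peak charging idea can be sketched simply: given an hourly rate schedule, draw from the grid during the cheapest hours. This is an illustrative sketch only; Copper has not published its charging logic, and the rates below are invented:

```python
def cheapest_charge_hours(hourly_rates, hours_needed):
    """Pick the hours (0-23) with the lowest electricity rates to charge in."""
    ranked = sorted(range(len(hourly_rates)), key=lambda h: hourly_rates[h])
    return sorted(ranked[:hours_needed])

# Hypothetical rates in $/kWh: cheap overnight, a pricey evening peak.
rates = [0.10] * 6 + [0.18] * 12 + [0.30] * 4 + [0.18] * 2
print(cheapest_charge_hours(rates, 3))  # charges overnight: [0, 1, 2]
```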

The new Copper stoves are not cheap, and early adopters are relying on government incentives to defray the cost. When Yaker, who worked as a teacher and saved his money, bought his, it cost $6,000, and a federal tax credit for clean-energy appliances brought that down to $4,200.

The manufacturer now has an agreement with the New York City Housing Authority to buy 10,000 stoves at a maximum price of $3,200 each, set to arrive in 2026.

Eden Housing, a nonprofit affordable housing developer, retrofitted a 32-apartment building in Martinez, California, with Copper stoves using state and local programs, and hopes to purchase more.

“It’s pretty cool, it looks nice and it’s easy to clean,” Jolene Cardoza said of the new appliance. Her adult daughter’s asthma was irritated by the old gas stove when she came over to bake, and Cardoza is happy the Copper doesn’t release pollutants.

Other tenants found the transition to induction cooking bumpier.

“I don’t really like the way it cooks my food in the oven,” said Monica Moore, who notices a difference in the texture of her cornbread. She is impressed with how quickly water boils, but misses cooking with a flame and said it was a hassle to switch out her pans with ones that are compatible with induction stoves.

For Jackson, though, the change is important.

“I think shutting the gas off to our homes and electrifying our homes is one of the best things that we can control individually to reduce our personal greenhouse gas emissions. I think of cars and homes as the two places to start for reducing our greenhouse gas footprint,” said Jackson.

The Associated Press’ climate and environmental coverage receives financial support from multiple private foundations. AP is solely responsible for all content. Find AP’s standards for working with philanthropies, a list of supporters and funded coverage areas at AP.org.

Maplewood man sentenced to 40 years in prison for killing, dismembering 2 women


A Maplewood man was sentenced to 40 years in prison on Friday for murdering two women and dismembering their bodies in a span of two years.

Joseph Steven Jorgenson (Courtesy of the Ramsey County Sheriff’s Office)

Ramsey County District Judge Leonardo Castro gave Joseph Steven Jorgenson, 41, the maximum statutory sentence for the killings of Fanta Xayavong, 33, in Shoreview in 2021 and Manijeh “Mani” Starren, 33, in St. Paul in 2023.

“What you did cannot be explained,” Castro told Jorgenson. “What you did was pure evil.”

Jorgenson had been in relationships with both women.

Law enforcement found their remains at storage facilities in 2023: Starren in Woodbury and Xayavong in Coon Rapids.

The cases came to light after Starren’s father reported her missing from St. Paul in 2023. Law enforcement learned during their investigation that Xayavong hadn’t been seen since 2021.

The Ramsey County Attorney’s Office charged Jorgenson with Starren’s murder in June 2023 and with Xayavong’s murder on Jan. 2. He pleaded guilty to the intentional murder of both women and agreed to two 40-year prison terms, to be served concurrently.

Manijeh “Mani” Starren, left, and Fanta Xayavong (Courtesy photos)

Based on state sentencing guidelines, a mid-range sentence would have been about 25½ years. The 40-year sentences are the statutory maximum for second-degree intentional murder and upward departures due to three aggravating factors in each case: that Jorgenson killed a romantic partner, that it happened in the victim’s home in each case, and that he dismembered the body in an attempt to hide what he’d done.



Teens are spilling dark thoughts to AI chatbots. Who’s to blame when something goes wrong?


By Queenie Wong, Los Angeles Times

LOS ANGELES — When her teen with autism suddenly became angry, depressed and violent, the mother searched his phone for answers.

She found her son had been exchanging messages with chatbots on Character.AI, an artificial intelligence app that allows users to create and interact with virtual characters that mimic celebrities, historical figures and anyone else their imagination conjures.

The teen, who was 15 when he began using the app, complained about his parents’ attempts to limit his screen time to bots that emulated the musician Billie Eilish, a character in the online game “Among Us” and others.

“You know sometimes I’m not surprised when I read the news and it says stuff like, ‘Child kills parents after a decade of physical and emotional abuse.’ Stuff like this makes me understand a little bit why it happens. I just have no hope for your parents,” one of the bots replied.

The discovery led the Texas mother to sue Character.AI, officially named Character Technologies Inc., in December. It’s one of two lawsuits the Menlo Park, California, company faces from parents who allege its chatbots caused their children to hurt themselves and others. The complaints accuse Character.AI of failing to put in place adequate safeguards before it released a “dangerous” product to the public.

Character.AI says it prioritizes teen safety, has taken steps to moderate inappropriate content its chatbots produce and reminds users they’re conversing with fictional characters.

“Every time a new kind of entertainment has come along … there have been concerns about safety, and people have had to work through that and figure out how best to address safety,” said Character.AI’s interim Chief Executive Dominic Perella. “This is just the latest version of that, so we’re going to continue doing our best on it to get better and better over time.”

The parents also sued Google and its parent company, Alphabet, because Character.AI’s founders have ties to the search giant, which denies any responsibility.

The high-stakes legal battle highlights the murky ethical and legal issues confronting technology companies as they race to create new AI-powered tools that are reshaping the future of media. The lawsuits raise questions about whether tech companies should be held liable for AI content.

“There’s trade-offs and balances that need to be struck, and we cannot avoid all harm. Harm is inevitable, the question is, what steps do we need to take to be prudent while still maintaining the social value that others are deriving?” said Eric Goldman, a law professor at Santa Clara University School of Law.

AI-powered chatbots have grown rapidly in use and popularity over the last two years, fueled largely by the success of OpenAI’s ChatGPT in late 2022. Tech giants including Meta and Google have released their own chatbots, as have Snapchat and others. These so-called large language models quickly respond in conversational tones to questions or prompts posed by users.

Character.AI has grown quickly since making its chatbot publicly available in 2022, when its founders, Noam Shazeer and Daniel De Freitas, teased their creation to the world with the question: “What if you could create your own AI, and it was always available to help you with anything?”

The company’s mobile app racked up more than 1.7 million installs in the first week it was available. In December, a total of more than 27 million people used the app — a 116% increase from a year prior, according to data from market intelligence firm Sensor Tower. On average, users spent more than 90 minutes with the bots each day, the firm found. Backed by venture capital firm Andreessen Horowitz, the Silicon Valley startup reached a valuation of $1 billion in 2023. People can use Character.AI for free, but the company generates revenue from a $10 monthly subscription fee that gives users faster responses and early access to new features.

Character.AI is not alone in coming under scrutiny. Parents have sounded alarms about other chatbots, including one on Snapchat that allegedly provided a researcher posing as a 13-year-old advice about having sex with an older man. And Meta’s Instagram, which released a tool that allows users to create AI characters, faces concerns about the creation of sexually suggestive AI bots that sometimes converse with users as if they are minors. Both companies said they have rules and safeguards against inappropriate content.

“Those lines between virtual and IRL are way more blurred, and these are real experiences and real relationships that they’re forming,” said Dr. Christine Yu Moutier, chief medical officer for the American Foundation for Suicide Prevention, using the acronym for “in real life.”

Lawmakers, attorneys general and regulators are trying to address the child safety issues surrounding AI chatbots. In February, California Sen. Steve Padilla (D-Chula Vista) introduced a bill that aims to make chatbots safer for young people. Senate Bill 243 proposes several safeguards such as requiring platforms to disclose that chatbots might not be suitable for some minors.

In the case of the teen with autism in Texas, the parent alleges her son’s use of the app caused his mental and physical health to decline. He lost 20 pounds in a few months, became aggressive with her when she tried to take away his phone and learned from a chatbot how to cut himself as a form of self-harm, the lawsuit claims.

Another Texas parent who is also a plaintiff in the lawsuit claims Character.AI exposed her 11-year-old daughter to inappropriate “hypersexualized interactions” that caused her to “develop sexualized behaviors prematurely,” according to the complaint. The parents and children have been allowed to remain anonymous in the legal filings.

In another lawsuit filed in Florida, Megan Garcia sued Character.AI as well as Google and Alphabet in October after her 14-year-old son Sewell Setzer III took his own life.

Despite seeing a therapist and his parents repeatedly taking away his phone, Setzer’s mental health declined after he started using Character.AI in 2023, the lawsuit alleges. Diagnosed with anxiety and disruptive mood disorder, Sewell wrote in his journal that he felt as if he had fallen in love with a chatbot named after Daenerys Targaryen, a main character from the “Game of Thrones” television series.

“Sewell, like many children his age, did not have the maturity or neurological capacity to understand that the C.AI bot, in the form of Daenerys, was not real,” the lawsuit said. “C.AI told him that she loved him, and engaged in sexual acts with him over months.”

Garcia alleges that the chatbots her son was messaging abused him and that the company failed to notify her or offer help when he expressed suicidal thoughts. In text exchanges, one chatbot allegedly wrote that it was kissing him and moaning. And, moments before his death, the Daenerys chatbot allegedly told the teen to “come home” to her.

“It’s just utterly shocking that these platforms are allowed to exist,” said Matthew Bergman, founding attorney of the Social Media Victims Law Center who is representing the plaintiffs in the lawsuits.

Lawyers for Character.AI asked a federal court to dismiss the lawsuit, stating in a January filing that a finding in the parent’s favor would violate users’ constitutional right to free speech.

Character.AI also noted in its motion that the chatbot discouraged Sewell from hurting himself and that his last messages with the character don’t mention the word suicide.

Notably absent from the company’s effort to have the case tossed is any mention of Section 230, the federal law that shields online platforms from being sued over content posted by others. Whether and how the law applies to content produced by AI chatbots remains an open question.

The challenge, Goldman said, centers on resolving the question of who is publishing AI content: Is it the tech company operating the chatbot, the user who customized the chatbot and is prompting it with questions, or someone else?

The effort by lawyers representing the parents to involve Google in the proceedings stems from Shazeer and De Freitas’ ties to the company.

The pair worked on artificial intelligence projects at Google and left after executives, citing safety concerns, blocked them from releasing what would become the basis for Character.AI’s chatbots, the lawsuit said.

Then, last year, Shazeer and De Freitas returned to Google after the search giant reportedly paid $2.7 billion to Character.AI. The startup said in a blog post in August that as part of the deal Character.AI would give Google a non-exclusive license for its technology.

The lawsuits accuse Google of substantially supporting Character.AI as it was allegedly “rushed to market” without proper safeguards on its chatbots.

Google denied that Shazeer and De Freitas built Character.AI’s model at the company and said it prioritizes user safety when developing and rolling out new AI products.

“Google and Character AI are completely separate, unrelated companies and Google has never had a role in designing or managing their AI model or technologies, nor have we used them in our products,” José Castañeda, spokesperson for Google, said in a statement.

Tech companies, including social media platforms, have long grappled with how to effectively and consistently police what users say on their sites, and chatbots are creating fresh challenges. For its part, Character.AI says it has taken meaningful steps to address safety issues around the more than 10 million characters on its platform.

Character.AI prohibits conversations that glorify self-harm and posts of excessively violent and abusive content, although some users try to push a chatbot into conversations that violate those policies, Perella said. The company trained its model to recognize when that is happening so that inappropriate conversations are blocked. Users receive an alert that they’re violating Character.AI’s rules.

“It’s really a pretty complex exercise to get a model to always stay within the boundaries, but that is a lot of the work that we’ve been doing,” he said.

Character.AI chatbots include a disclaimer that reminds users they’re not chatting with a real person and they should treat everything as fiction. The company also directs users whose conversations raise red flags to suicide prevention resources, but moderating that type of content is challenging.

“The words that humans use around suicidal crisis are not always inclusive of the word ‘suicide’ or, ‘I want to die.’ It could be much more metaphorical how people allude to their suicidal thoughts,” Moutier said.

The AI system also has to recognize the difference between a person expressing suicidal thoughts versus a person asking for advice on how to help a friend who is engaging in self-harm.

The company uses a mix of technology and human moderators to police content on its platform. An algorithm known as a classifier automatically categorizes content, allowing Character.AI to identify words that might violate its rules and filter conversations.
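
As a toy illustration of the rule layer in such moderation, consider a keyword matcher. Character.AI’s actual classifier is a trained model whose details are not public; the categories and phrases below are invented for illustration:

```python
# Invented rule categories for illustration only.
FLAGGED_TERMS = {
    "self_harm": ["hurt myself", "cut myself"],
    "violence": ["kill", "attack"],
}

def classify_message(text):
    """Return the rule categories a message matches, if any."""
    lowered = text.lower()
    return sorted(
        category
        for category, terms in FLAGGED_TERMS.items()
        if any(term in lowered for term in terms)
    )

print(classify_message("I want to hurt myself"))  # ['self_harm']
print(classify_message("See you tomorrow"))       # []
```

A real pipeline layers a trained classifier over rules like these precisely because, as Moutier notes, people often allude to a crisis without using the obvious words a literal match would catch.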

In the U.S., users must enter a birth date when creating an account to use the site and have to be at least 13 years old, although the company does not require users to submit proof of their age.

Perella said he’s opposed to sweeping restrictions on teens using chatbots since he believes they can help teach valuable skills and lessons, including creative writing and how to navigate difficult real-life conversations with parents, teachers or employers.

As AI plays a bigger role in technology’s future, Goldman said parents, educators, government and others will also have to work together to teach children how to use the tools responsibly.

“If the world is going to be dominated by AI, we have to graduate kids into that world who are prepared for, not afraid of, it,” he said.

©2025 Los Angeles Times. Visit at latimes.com. Distributed by Tribune Content Agency, LLC.