Browsing by Author "Guessoum, Zahia"
- Optimizing Olive Disease Classification Through Hybrid Machine Learning and Deep Learning Techniques
  Publication. Mendes, João; Moso, Juliet; Berger, Guido; Lima, José; Costa, Lino; Guessoum, Zahia; Pereira, Ana I.
  Olive trees play a crucial role in the global agricultural landscape, serving as a primary source of olive oil production. However, olive trees are susceptible to several diseases, which can significantly impact yield and quality. This study addresses the challenge of improving the diagnosis of diseases in olive trees, specifically focusing on Aculus olearius and Olive Peacock Spot. Using a novel hybrid approach that combines deep learning and machine learning methodologies, the authors aimed to optimize disease classification accuracy by analyzing images of olive leaves. The presented methodology integrates Local Binary Patterns (LBP) and an adapted ResNet50 model for feature extraction, followed by classification through optimized machine learning models, including Stochastic Gradient Descent (SGD), Support Vector Machine (SVM), and Random Forest (RF). The results demonstrated that the hybrid model achieved an accuracy of 99.11%, outperforming existing models. This advancement underscores the potential of integrated technological approaches in agricultural disease management and sets a new benchmark for the early and accurate detection of foliar diseases.
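  The abstract describes fusing hand-crafted LBP texture features with ResNet50 features before classification. As a minimal sketch of the LBP half of that feature extractor, here is the basic 8-neighbour, 256-bin LBP variant in plain NumPy; the function names and this particular LBP configuration are illustrative assumptions, not the paper's exact setup:

  ```python
  import numpy as np

  def lbp_codes(img):
      """8-neighbour Local Binary Pattern code for each interior pixel."""
      c = img[1:-1, 1:-1].astype(np.int32)            # centre pixels
      offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
                 (1, 1), (1, 0), (1, -1), (0, -1)]     # clockwise neighbours
      codes = np.zeros(c.shape, dtype=np.uint8)
      h, w = img.shape
      for bit, (dy, dx) in enumerate(offsets):
          # shifted view of the image aligned with the centre block
          n = img[1 + dy:h - 1 + dy, 1 + dx:w - 1 + dx].astype(np.int32)
          # set this bit wherever the neighbour is >= the centre pixel
          codes |= ((n >= c).astype(np.uint8) << bit)
      return codes

  def lbp_histogram(img, bins=256):
      """Normalised histogram of LBP codes -- the texture descriptor."""
      hist, _ = np.histogram(lbp_codes(img), bins=bins, range=(0, bins))
      return hist / hist.sum()
  ```

  In the hybrid pipeline sketched by the abstract, such a histogram would be concatenated with the ResNet50 feature vector and the combined vector passed to an optimized SGD, SVM, or RF classifier.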
- XAI Framework for Fall Detection in an AAL System
  Publication. Messaoudi, Chaima; Kalbermatter, Rebeca B.; Lima, José; Pereira, Ana I.; Guessoum, Zahia
  Ambient Assisted Living (AAL) systems are human-centered and designed to prioritize the needs of elderly individuals, providing them with assistance in emergencies or unexpected situations. These systems involve caregivers or selected individuals who can be alerted and provide the necessary help when needed. To ensure effective assistance, it is crucial for caregivers to understand the reasons behind alarm triggers and the nature of the danger. This is where an explainability module comes into play. In this paper, we introduce an explainability module that offers visual explanations for the fall detection module. Our framework involves generating anchor boxes using the K-means algorithm to optimize object detection and using YOLOv8 for image inference. Additionally, we employ two well-known XAI (Explainable Artificial Intelligence) algorithms, LIME (Local Interpretable Model-agnostic Explanations) and Grad-CAM (Gradient-weighted Class Activation Mapping), to provide visual explanations.
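  The anchor-box step mentioned above, clustering training-box sizes with K-means so the detector starts from representative priors, can be sketched in a few lines of NumPy. This is a simplified illustration under stated assumptions: the function name is hypothetical, and it uses plain Euclidean distance on (width, height) pairs, whereas YOLO-style pipelines typically use an IoU-based distance:

  ```python
  import numpy as np

  def kmeans_anchors(boxes, k=3, iters=50, seed=0):
      """Cluster (width, height) pairs with K-means to derive k anchor boxes.

      boxes: float array of shape (n, 2) holding box widths and heights.
      Returns k centres sorted by area, smallest first.
      """
      rng = np.random.default_rng(seed)
      centres = boxes[rng.choice(len(boxes), size=k, replace=False)]
      for _ in range(iters):
          # assign each box to its nearest centre
          d = np.linalg.norm(boxes[:, None, :] - centres[None, :, :], axis=2)
          labels = d.argmin(axis=1)
          # move each centre to the mean of its assigned boxes
          new = np.array([boxes[labels == i].mean(axis=0)
                          if np.any(labels == i) else centres[i]
                          for i in range(k)])
          if np.allclose(new, centres):
              break
          centres = new
      return centres[np.argsort(centres.prod(axis=1))]
  ```

  The resulting centres would serve as the anchor priors for a detector such as YOLOv8; the paper's own anchor-generation details are not reproduced here.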
