Show simple item record

dc.contributor.author: ElKafrawy, Passent
dc.contributor.author: Moharram, Hassan
dc.contributor.author: Awad, Ahmed
dc.date.accessioned: 2023-03-16T04:56:35Z
dc.date.available: 2023-03-16T04:56:35Z
dc.date.issued: 2022-05-22
dc.identifier.citation: Hassan Moharram, Ahmed Awad, and Passent M. El-Kafrawy. 2022. Optimizing ADWIN for steady streams. In Proceedings of the 37th ACM/SIGAPP Symposium on Applied Computing (SAC '22). Association for Computing Machinery, New York, NY, USA, 450–459. https://doi.org/10.1145/3477314.3507074
dc.identifier.doi: https://doi.org/10.1145/3477314.3507074
dc.identifier.uri: http://hdl.handle.net/20.500.14131/686
dc.description.abstract: With ever-growing data generation rates and stringent constraints on the latency of analyzing such data, stream analytics is overtaking batch processing. Learning from data streams, a.k.a. online machine learning, is no exception. However, online machine learning brings many challenges across the learning process, from algorithm design to the evaluation method. One of these challenges is the ability of a learning system to adapt to changes in the data distribution, known as concept drift, so as to maintain the accuracy of its predictions. Over time, several drift detection approaches have been proposed. A prominent one is adaptive windowing (ADWIN), which can detect changes in the feature data distribution without explicit feedback on the correctness of predictions. Several variants of ADWIN have been proposed to improve its runtime performance w.r.t. throughput and latency. However, the drift detection accuracy of these variants was never compared with that of the original algorithm, and there is no study of the memory consumption of the variants or the original algorithm. Additionally, evaluations were done on synthetic datasets with a considerable number of drifts, covering neither all types of drift nor steady streams, i.e., streams with no drifts at all or only negligible ones. The contribution of this paper is two-fold. First, we compare the original adaptive windowing algorithm (ADWIN) and its variants, Serial, HalfCut, and Optimistic, in terms of drift detection accuracy, detection speed, and memory consumption, represented by the internal window size. We compare them on synthetic datasets covering the different types of concept drift, namely incremental, gradual, abrupt, and steady, as well as on two real-life datasets whose drifts are unknown. Second, we present ADWIN++, which uses an adaptive bucket-dropping technique to control the window size. We evaluate our technique on the same datasets and on new datasets with fewer drifts. Experiments show that our approach saves about 80% of memory consumption, takes less time to detect concept drift, and maintains drift detection accuracy.
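The adaptive-windowing idea the abstract builds on can be sketched as follows. This is a minimal, brute-force illustration of the core ADWIN principle only (keep a window of recent values and cut off the older part whenever two sub-windows have significantly different means); it is not the paper's optimized, bucket-based ADWIN++ implementation, and the function name, the `delta` parameter, and the exhaustive split scan are illustrative assumptions.

```python
import math
from collections import deque

def adwin_detect(stream, delta=0.002):
    """Illustrative ADWIN-style drift detector (sketch, not the paper's code).

    Maintains a window of recent values. After each arrival, every split of
    the window into an older and a newer sub-window is checked; if the two
    sub-window means differ by more than a Hoeffding-style threshold, the
    older sub-window is dropped and a drift is reported at that position.
    """
    window = deque()
    drifts = []
    for t, x in enumerate(stream):
        window.append(x)
        shrunk = True
        while shrunk and len(window) >= 2:
            shrunk = False
            n = len(window)
            total = sum(window)
            left_sum, left_n = 0.0, 0
            for i in range(n - 1):
                left_sum += window[i]
                left_n += 1
                right_n = n - left_n
                # Harmonic mean of the two sub-window sizes.
                m = 1.0 / (1.0 / left_n + 1.0 / right_n)
                # Hoeffding-style cut threshold for values in [0, 1].
                eps = math.sqrt((1.0 / (2.0 * m)) * math.log(4.0 / delta))
                diff = abs(left_sum / left_n - (total - left_sum) / right_n)
                if diff > eps:
                    for _ in range(left_n):   # drop the older sub-window
                        window.popleft()
                    drifts.append(t)
                    shrunk = True             # re-check the shrunken window
                    break
    return drifts
```

On a stream that jumps from 0.0 to 1.0, the detector cuts the window shortly after the jump; on a constant (steady) stream, the window only grows and no drift is reported. The exhaustive split scan makes this sketch quadratic per step; the bucket structure of the real ADWIN family exists precisely to avoid that cost.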
dc.subject: Optimizing ADWIN for steady streams
dc.subject: Online machine learning
dc.subject: Concept drifts
dc.subject: Steady streams
dc.title: Optimizing ADWIN for steady streams
dc.contributor.researcher: External Collaboration
dc.contributor.lab: Artificial Intelligence & Cyber Security Lab
dc.subject.KSA: ICT
dc.source.index: Scopus
dc.contributor.department: Computer Science
dc.contributor.firstauthor: Moharram, Hassan
dc.conference.location: SAC '22: Proceedings of the 37th ACM/SIGAPP Symposium on Applied Computing
dc.conference.date: 2022-04

